
Apple introduces M4 chip

praseodym
229 replies
6h47m

"With these improvements to the CPU and GPU, M4 maintains Apple silicon’s industry-leading performance per watt. M4 can deliver the same performance as M2 using just half the power. And compared with the latest PC chip in a thin and light laptop, M4 can deliver the same performance using just a fourth of the power."

That's an incredible improvement in just a few years. I wonder how much of that is Apple engineering and how much is TSMC improving their 3nm process.

izacus
131 replies
6h31m

Apple usually massively exaggerates their tech spec comparison - is it REALLY half the power use of all times (so we'll get double the battery life) or is it half the power use in some scenarios (so we'll get like... 15% more battery life total) ?

cletus
69 replies
5h52m

IME Apple has always been the most honest when it makes performance claims. Like when they said a MacBook Air would last 10+ hours and third-party reviewers would get 8-9+ hours. All the while, Dell or HP would claim 19 hours and you'd be lucky to get 2, e.g. [1].

As for CPU power use, of course that doesn't translate into doubling battery life because there are other components. And yes, it seems the OLED display uses more power so, all in all, battery life seems to be about the same.

I'm interested to see an M3 vs M4 performance comparison in the real world. IIRC the M3 was a questionable upgrade. Some things were better but some weren't.

Overall the M-series SoCs have been an excellent product however.

[1]: https://www.laptopmag.com/features/laptop-battery-life-claim...

EDIT: added link

ajross
46 replies
5h43m

IME Apple has always been the most honest when it makes performance claims

That's just laughable, sorry. No one is particularly honest in marketing copy, but Apple is for sure one of the worst, historically. Even more so when you go back to the PPC days. I still remember Jobs on stage talking about how the G4 was the fastest CPU in the world when I knew damn well that it was half the speed of the P3 on my desk.

brokencode
24 replies
5h0m

Have any examples from the past decade? Especially in the context of how exaggerated the claims are from the PC and Android brands they're competing with?

lynndotpy
21 replies
4h33m

Apple recently claimed that RAM in their MacBooks is equivalent to 2x the RAM in any other machine, in defense of the 8GB starting point.

In my experience, I can confirm that this is just not true. The secret is heavy reliance on swap. It's still the case that 1GB = 1GB.

dmitrygr
8 replies
4h14m

The secret is heavy reliance on swap

You are entirely (100%) wrong, but, sadly, NDA...

sudosysgen
2 replies
3h57m

Memory compression isn't magic and isn't exclusive to macOS.

dmitrygr
1 replies
3h23m

I suggest you go and look at HOW it is done in Apple silicon Macs, and then think long and hard about why this might make a huge difference. Maybe the Asahi Linux guys can explain it to you ;)

sudosysgen
0 replies
3h2m

I understand that it can make a difference to performance (which is already baked into the benchmarks we look at), but I don't see how it can make a difference to compression ratios; if anything, in similar implementations (e.g. console APUs) it tends to lead to worse compression ratios.

If there's any publicly available data to the contrary I'd love to read it. Anecdotally I haven't seen a significant difference between zswap on Linux and macOS memory compression in terms of compression ratios, and on the workloads I've tested zswap tends to be faster than no memory compression on x86 for many-core machines.
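
If anyone wants to eyeball their own ratios, this is roughly where both OSes expose them (a sketch: it assumes zswap is enabled with debugfs mounted and a 4 KiB page size; paths and field names can vary by version):

    # macOS: uncompressed pages stored vs. pages actually occupied by the compressor
    vm_stat | grep compressor

    # Linux with zswap (as root; only meaningful once some pages have been swapped)
    stored=$(cat /sys/kernel/debug/zswap/stored_pages)
    pool=$(cat /sys/kernel/debug/zswap/pool_total_size)
    echo "scale=2; $stored * 4096 / $pool" | bc   # rough compression ratio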

monsieurbanana
2 replies
4h1m

Regardless of what you can't tell, he's absolutely right regarding Apple's claims: saying that an 8GB Mac is as good as a 16GB non-Mac is laughable.

dmitrygr
0 replies
3h22m

That was never said. They said an 8GB Mac is similar to a 16GB non-Mac.

Zanfa
0 replies
1h31m

My entry-level 8GB M1 Macbook Air beats my 64GB 10-core Intel iMac in my day-to-day dev work.

lynndotpy
0 replies
3h54m

I do admit the "reliance on swap" thing is speculation on my part :)

My experience is that I can still tell when the OS is unhappy when I demand more RAM than it can give. MacOS is still relatively responsive around this range, which I just attributed to super fast swapping. (I'd assume memory compression too, but I usually run into this trouble when working with large amounts of poorly-compressible data.)

In either case, I know it's frustrating when someone is confidently wrong but you can't properly correct them, so you have my apologies

ethanwillis
0 replies
4h5m

How convenient :)

rahkiin
5 replies
4h4m

There is also memory compression, and their insane swap speed due to the SoC's unified memory and fast SSD.

anaisbetts
4 replies
3h28m

Every modern operating system now does memory compression

astrange
3 replies
3h23m

Some of them do it better than others though.

Rinzler89
2 replies
2h29m

Apple uses Magic Compression.

throwaway744678
0 replies
40m

This is a revolution

adamomada
0 replies
56m

Not sure what Windows does, but the popular method on e.g. Fedora (zram) is to set aside part of RAM as a compressed swap device. It could be more efficient the way Apple does it, by not having to partition main memory.
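
If you're curious what that looks like on a stock Fedora install (assuming the default zram-based swap; names vary by distro), roughly:

    zramctl          # shows the compressed device, its algorithm, and data size vs. compressed size
    swapon --show    # the zram device appears as swap alongside any disk-backed swap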

seec
2 replies
2h52m

Yeah, that was hilarious; my basic workload borders on the 8GB limit without even pushing it. They have fast swap, but nothing beats real RAM in the end, and considering their storage pricing is as stupid as their RAM pricing, it really makes no difference.

If you go for the base model, you are in for a bad time: 256GB with heavy swap and no dedicated GPU memory (making the 8GB even worse) is just plain stupid.

This is what the Apple fanboys don't seem to get: their base models at a somewhat affordable price are deeply incompetent, and if you start to load them up the pricing just does not make a lot of sense...

n9
0 replies
27m

You know that the RAM in these machines has more differences than similarities with the "RAM" in a standard PC? Apple's SoC RAM is more or less part of the CPU/GPU and is super fast. And for obvious reasons it cannot be added to.

Anyway, I manage a few M1 and M3 machines with 256/8 configs and they all run just as fast as 16GB and 32GB machines EXCEPT for workloads that need more than 8GB for a single process (virtualization) or workloads that need lots of video memory (Lightroom can KILL an 8GB machine that isn't doing anything else...)

The 8GB is stupid discussion isn't "wrong" in the general case, but it is wrong for maybe 80% of users.

kcartlidge
0 replies
15m

If you go for the base model, you are in for a bad time: 256GB with heavy swap and no dedicated GPU memory (making the 8GB even worse) is just plain stupid ... their base models at a somewhat affordable price are deeply incompetent

I got the base model M1 Air a couple of years back and whilst I don't do much gaming I do do C#, Python, Go, Rails, local Postgres, and more. I also have a (new last year) Lenovo 13th gen i7 with 16GB RAM running Windows 11 and the performance with the same load is night and day - the M1 walks all over it whilst easily lasting 10hrs+.

Note that I'm not a fanboy; I run both by choice. Also both iPhone and Android.

The Windows laptop often gets sluggish and hot. The M1 never slows down and stays cold. There's just no comparison (though the Air keyboard remains poor).

I don't much care about the technical details, and I know 8GB isn't a lot. I care about the experience and the underspecced Mac wins.

brokencode
2 replies
4h7m

Sure, and they were widely criticized for this. Again, the assertion I was responding to is that Apple does this “laughably” more than competitors.

Is an occasional statement that they get pushback on really worse than what other brands do?

As an example from a competitor, take a look at the recent firestorm over Intel’s outlandish anti-AMD marketing:

https://wccftech.com/intel-calls-out-amd-using-old-cores-in-...

ajross
1 replies
4h4m

Sure, and they were widely criticized for this. Again, the assertion I was responding to is that Apple does this “laughably” more than competitors.

FWIW: the language upthread was that it was laughable to say Apple was the most honest. And I stand by that.

brokencode
0 replies
3h49m

Fair point. Based on their first sentence, I mischaracterized how “laughable” was used.

Though the author also made clear in their second sentence that they think Apple is one of the worst when it comes to marketing claims, so I don’t think your characterization is totally accurate either.

cwillu
1 replies
4h6m

If someone is claiming “‹foo› has always ‹barred›”, then I don't think it's fair to demand a 10 year cutoff on counter-evidence.

bee_rider
0 replies
3h22m

Clearly it isn’t the case that Apple has always been more honest than their competition, because there were some years before Apple was founded.

leptons
7 replies
5h6m

Apple marketed their PPC systems as "a supercomputer on your desk", but it was nowhere near the performance of a supercomputer of that age. Maybe similar performance to a supercomputer from the 1970's, but that was their marketing angle from the 1990's.

georgespencer
3 replies
4h16m

Apple marketed their PPC systems as "a supercomputer on your desk"

It's certainly fair to say that twenty years ago Apple was marketing some of its PPC systems as "the first supercomputer on a chip"[^1].

but it was nowhere near the performance of a supercomputer of that age.

That was not the claim. Apple did not argue that the G4's performance was commensurate with the state of the art in supercomputing. (If you'll forgive me: like, fucking obviously? The entire reason they made the claim is precisely because the latest room-sized supercomputers with leapfrog performance gains were in the news very often.)

The claim was that the G4 was capable of sustained gigaflop performance, and therefore met the narrow technical definition of a supercomputer.

You'll see in the aforelinked marketing page that Apple compared the G4 chip to UC Irvine’s Aeneas Project, which in ~2000 was delivering 1.9 gigaflop performance.

This chart[^2] shows the trailing average of various subsets of super computers, for context.

This narrow definition is also why the machine could not be exported to many countries, which Apple leaned into.[^3]

Maybe similar performance to a supercomputer from the 1970's

What am I missing here? Picking perhaps the most famous supercomputer of the mid-1970s, the Cray-1,[^4] we can see performance of 160 MFLOPS, which is 160 million floating point operations per second (with an 80 MHz processor!).

The G4 was capable of delivering ~1 GFLOP performance, which is a billion floating point operations per second.

Are you perhaps thinking of a different decade?

[^1]: https://web.archive.org/web/20000510163142/http://www.apple....

[^2]: https://en.wikipedia.org/wiki/History_of_supercomputing#/med...

[^3]: https://web.archive.org/web/20020418022430/https://www.cnn.c...

[^4]: https://en.wikipedia.org/wiki/Cray-1#Performance

leptons
2 replies
3h53m

That was not the claim. Apple did not argue that the G4's performance was commensurate with the state of the art in supercomputing.

This is marketing we're talking about, people see "supercomputer on a chip" and they get hyped up by it. Apple was 100% using the "supercomputer" claim to make their luddite audience think they had a performance advantage, which they did not.

The entire reason they made the claim is

The reason they marketed it that way was to get people to part with their money. Full stop.

In the first link you added, there's a photo of a Cray supercomputer, which makes the viewer equate Apple = Supercomputer = I am a computing god if I buy this product. Apple's marketing has always been a bit shady that way.

And soon after that period Apple jumped off the PPC architecture and onto the x86 bandwagon. Gimmicks like "supercomputer on a chip" don't last long when the competition is far ahead.

threeseed
0 replies
2h29m

I can't believe Apple is marketing their products in a way to get people to part with their money.

If I had some pearls I would be clutching them right now.

georgespencer
0 replies
23m

This is marketing we're talking about, people see "supercomputer on a chip" and they get hyped up by it.

That is also not in dispute. I am disputing your specific claim that Apple somehow suggested that the G4 was of commensurate performance to a modern supercomputer, which does not seem to be true.

Apple was 100% using the "supercomputer" claim to make their luddite audience think they had a performance advantage, which they did not.

This is why context is important (and why I'd appreciate clarity on whether you genuinely believe a supercomputer from the 1970s was anywhere near as powerful as a G4).

In the late twentieth and early twenty-first century, megapixels were a proxy for camera quality, and megahertz were a proxy for processor performance. More MHz = more capable processor.

This created a problem for Apple, because the G4 crushed the Pentium III in SPECfp_95 (floating point) benchmarks despite running at lower clock speeds.

PPC G4 500 MHz - 22.6

PPC G4 450 MHz - 20.4

PPC G4 400 MHz - 18.36

Pentium III 600 MHz – 15.9

For both floating point and integer benchmarks, the G3 and G4 outgunned comparable Pentium II/III processors.

You can question how this translates to real world use cases – the Photoshop filters on stage were real, but others have pointed out in this thread that it wasn't an apples-to-apples comparison vs. Wintel – but it is inarguable that the G4 had some performance advantages over Pentium at launch, and that it met the (inane) definition of a supercomputer.

The reason they marketed it that way was to get people to part with their money. Full stop.

Yes, marketing exists to convince people to buy one product over another. That's why companies do marketing. IMO that's a self-evidently inane thing to say in a nested discussion of microprocessor architecture on a technical forum – especially when your interlocutor is establishing the historical context you may be unaware of (judging by your comment about supercomputers from the 1970s, which I am surprised you have not addressed).

I didn't say "The reason Apple markets its computers," I said "The entire reason they made the claim [about supercomputer performance]…"

Both of us appear to know that companies do marketing, but only you appear to be confused about the specific claims Apple made – given that you proactively raised them, and got them wrong – and the historical backdrop against which they were made.

In the first link you added, there's a photo of a Cray supercomputer

That's right. It looks like a stylized rendering of a Cray-1 to me – what do you think?

which makes the viewer equate Apple = Supercomputer = I am a computing god if I buy this product

The Cray-1's compute, as measured in GFLOPS, was approximately 6.5x lower than the G4 processor.

I'm therefore not sure what your argument is: you started by claiming that Apple deliberately suggested that the G4 had comparable performance to a modern supercomputer. That isn't the case, and the page you're referring to contains imagery of a much less performant supercomputer, as well as a lot of information relating to the history of supercomputers (and a link to a Forbes article).

Apple's marketing has always been a bit shady that way.

All companies make tradeoffs they think are right for their shareholders and customers. They accentuate the positives in marketing and gloss over the drawbacks.

Note, too, that Adobe's CEO has been duped on the page you link to. Despite your emphatic claim:

Apple was 100% using the "supercomputer" claim to make their luddite audience think they had a performance advantage, which they did not.

The CEO of Adobe is quoted as saying:

“Currently, the G4 is significantly faster than any platform we’ve seen running Photoshop 5.5,” said John E. Warnock, chairman and CEO of Adobe.

How is what you are doing materially different to what you accuse Apple of doing?

And soon after that period Apple jumped off the PPC architecture and onto the x86 bandwagon.

They did so when Intel's roadmap introduced Core Duo, which was significantly more energy-efficient than Pentium 4. I don't have benchmarks to hand, but I suspect that a PowerBook G5 would have given the Core Duo a run for its money (despite the G5 being significantly older), but only for about fifteen seconds before thermal throttling and draining the battery entirely in minutes.

galad87
1 replies
4h50m

From https://512pixels.net/2013/07/power-mac-g4/: the ad was based on the fact that Apple was forbidden to export the G4 to many countries due to its “supercomputer” classification by the US government.

Vvector
0 replies
3h31m

Blaming a company TODAY for marketing from the 1990s is crazy.

mort96
3 replies
5h31m

Interesting, by what benchmark did you compare the G4 and the P3?

I don't have a horse in this race, Jobs lied or bent the truth all the time so it wouldn't surprise me, I'm just curious.

dblohm7
2 replies
5h13m

I remember that Apple used to wave around these SIMD benchmarks showing their PowerPC chips trouncing Intel chips. In the fine print, you'd see that the benchmark was built to use AltiVec on PowerPC, but without MMX or SSE on Intel.

0x457
1 replies
4h42m

Ah so the way Intel advertises their chips. Got it.

mort96
0 replies
4h15m

Yeah, and we rightfully criticize Intel for the same and we distrust their benchmarks

dijit
2 replies
5h37m

You can claim Apple is dishonest for a few reasons.

1) Graphs are often unannotated.

2) Comparisons are rarely against latest-generation products. (Their argument for that has been that they do not expect people to upgrade yearly, so it's showing the difference across their intended upgrade path.)

3) They have conflated performance with performance per watt.

However, when it comes to battery life, performance (for a task) or specification of their components (screens, ability to use external displays up to 6k, port speed etc) there are almost no hidden gotchas and they have tended to be trustworthy.

The first wave of M1 announcements were met with similar suspicion as you have shown here; but it was swiftly dispelled once people actually got their hands on them.

*EDIT:* Blaming a guy who's been dead for 13 years for something he said 50 years ago, and primarily, it seems, for internal use, is weird. I had to look up the context, but it seems it was more about internal motivation in the '70s than relating to anything today, especially when referring to concrete claims.

Brybry
1 replies
4h47m

"This thing is incredible," Jobs said. "It's the first supercomputer on a chip.... We think it's going to set the industry on fire."

"The G4 chip is nearly three times faster than the fastest Pentium III"

- Steve Jobs (1999) [1]

[1] https://www.wired.com/1999/08/lavish-debut-for-apples-g4/

dijit
0 replies
4h44m

That's cool, but literally last millennium.

And again, the guy has been dead for the better part of this millennium.

What have they shown of any product currently on the market, especially when backed with any concrete claim, that has been proven untrue?

EDIT: After reading your article and this one: https://lowendmac.com/2006/twice-as-fast-did-apple-lie-or-ju... it looks like it was true in floating point workloads.

zeroonetwothree
1 replies
5h40m

Indeed. Have we already forgotten about the RDF?

coldtea
0 replies
5h10m

No, it was just always a meaningless term...

windowsrookie
0 replies
4h27m

While certainly misleading, there were situations where the G4 was incredibly fast for the time. I remember being able to edit video in iMovie on a 12" G4 laptop. At that time there was no equivalent x86 machine.

n9
0 replies
31m

Worked in an engineering lab at the time of the G4 introduction and I can attest that the G4 was a very, very fast CPU for scientific workloads.

Confirmed here: https://computer.howstuffworks.com/question299.htm (and elsewhere.)

A year later I was doing bonkers (for the time) Photoshop work on very large compressed TIFF files, and my G4 laptop running at 400MHz was more than 2x as fast as the PIIIs on my bench.

Was it faster all around? I don't know how to tell. Was Apple as honest as I am in this commentary about how it mattered what you were doing? No. Was it a CPU that was able to do some things very fast vs others? I know it was.

mc32
0 replies
5h26m

Didn’t he have to use two PPC procs to get the equivalent perf you’d get on a P3?

Just add them up, it’s the same number of Hertz!

But Steve that’s two procs vs one!

I think this is when Adobe was optimizing for Windows/intel and was single threaded, but Steve put out some graphs showing better perf on the Mac.

jmull
0 replies
3h41m

If you have to go back 20+ years for an example…

bvrmn
5 replies
3h54m

BTW I get 19 hours from my Dell XPS and Latitude. It's Linux with a custom DE and Vim as my IDE, though.

wklm
2 replies
3h1m

can you share more details about your setup?

bvrmn
1 replies
2h49m

Arch Linux, CPU mitigations (Spectre and the like) off, X11, Openbox, bmpanel with only a CPU/IO indicator. Light theme everywhere. Opera in power save mode. `powertop --auto-tune` and `echo 1 | sudo tee /sys/devices/system/cpu/intel_pstate/no_turbo`. Current laptop is a Latitude 7390.
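
Roughly, the power-tuning part boils down to this (a sketch assuming intel_pstate; turbostat ships with the kernel tools and its flags may differ by version):

    # apply powertop's runtime tweaks (needs root)
    sudo powertop --auto-tune
    # keep the CPU out of turbo
    echo 1 | sudo tee /sys/devices/system/cpu/intel_pstate/no_turbo
    # sanity-check average package power afterwards
    sudo turbostat --quiet --Summary --show PkgWatt --interval 5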

ribit
0 replies
56m

Right, so you are disabling all performance features and effectively turning your CPU into a low-end, low-power SKU. Of course you’d get better battery life. It’s not the same thing though.

bee_rider
0 replies
3h25m

This is why Apple can be slightly more honest about their battery specs: they don’t have the OS working against them. Unfortunately most Dell XPS machines will be running Windows, so it is still misleading to provide specs based on what the hardware could do if not sabotaged.

amarka
0 replies
3h33m

I get about 21 hours from mine, it's running Windows but powered off.

moogly
4 replies
4h26m

IME Apple has always been the most honest when it makes performance claims.

I guess you weren't around during the PowerPC days... Because that's a laughable statement.

oblio
1 replies
3h54m

I have no idea who's downvoting you. They were lying through their teeth about CPU performance back then.

A PC half the price was smoking their top of the line stuff.

seec
0 replies
2h58m

It's funny you say that, because this is precisely the time I started buying Macs (I got a Pismo PowerBook G3 gifted and then bought an iBook G4). And my experience was that, for sure, if you put as much money into a PC as into a Mac you would get MUCH better performance.

What made it worth it at the time (I felt) was the software. Today I really don't think so; software has improved overall across the industry and there is not a lot that is "Mac specific" enough to make it a clear-cut choice.

As for the performance, I can't believe all the Apple silicon hype. Sure, it gets good battery life provided you use strictly Apple software (or software heavily optimized for it), but in mixed-workload situations it's not that impressive.

Using a friend's M2 MacBook Pro, I figured I could get maybe 4-5 hours out of it in a best-case scenario, which is better than the 2-3 hours you would get from a PC laptop, but also not that great considering the price difference.

And when it comes to performance it is extremely uneven and very lackluster for many things. Like there is more lag launching Activity Monitor on a 2K+ MacBook Pro than launching Task Manager on a 500 PC. This is a small, somewhat stupid example, but it does tell the overall story.

They talk a big game, but in reality their stuff isn't that performant in the real world.

And they still market games when one of their 2K laptops plays Dota 2 (a very old, relatively resource-efficient game) worse than a cheapo PC.

imwillofficial
0 replies
4h20m

All I remember is tanks in the commercials.

We need more tanks in commercials.

dylan604
0 replies
2h28m

Oh those megahertz myths! Their marketing department is pretty amazing at their spin control. This one was right up there with "it's not a bug; it's a feature" type of spin.

Aurornis
2 replies
4h56m

IME Apple has always been the most honest when it makes performance claims.

Okay, but your example was about battery life:

Like when they said a MacBook Air would last 10+ hours and third-party reviewers would get 8-9+ hours. All the while, Dell or HP would claim 19 hours and you'd be lucky to get 2, e.g. [1]

And even then, they exaggerated their claims. And your link doesn't say anything about HP or Dell claiming 19 hour battery life.

Apple has definitely exaggerated their performance claims over and over again. The Apple silicon parts are fast and low power indeed, but they've made ridiculous claims like comparing their chips to an Nvidia RTX 3090 with completely misleading graphs.

Even the Mac sites have admitted that the Nvidia 3090 comparison was completely wrong and designed to be misleading: https://9to5mac.com/2022/03/31/m1-ultra-gpu-comparison-with-...

This is why you have to take everything they say with a huge grain of salt. Their chip may be "twice" as power efficient in some carefully chosen unique scenario that only exists in an artificial setting, but how does it fare in the real world? That's the question that matters, and you're not going to get an honest answer from Apple's marketing team.

vel0city
0 replies
1h29m

You're right, it's not 19 hours claimed. It was even more than that.

HP gave the 13-inch HP Spectre x360 an absurd 22.5 hours of estimated battery life, while our real-world test results showed that the laptop could last for 12 hours and 7 minutes.

ribit
0 replies
59m

M1 Ultra did benchmark close to 3090 in some synthetic gaming tests. The claim was not outlandish, just largely irrelevant for any reasonable purpose.

Apple does usually explain their testing methodology and they don’t cheat on benchmarks like some other companies. It’s just that the results are still marketing and should be treated as such.

Outlandish claims notwithstanding, I don’t think anyone can deny the progress they achieved with their CPU and especially GPU IP. Improving performance on complex workloads by 30–50% in a single year is very impressive.

moooo99
1 replies
4h16m

They are pretty honest when it comes to battery life claims; they're less honest when it comes to benchmark graphs.

underlogic
0 replies
3h41m

I don't think "less honest" covers it, and I can't believe anything their marketing says after the 3090 claims. Maybe it's true, maybe not. We'll see from the reviews. Well, assuming the reviewers weren't paid off with an "evaluation unit".

treflop
0 replies
50m

Apple is always honest but they know how to make you believe something that isn’t true.

syncsynchalt
0 replies
4h23m

IME Apple has always been the most honest when it makes performance claims

Yes and no. They'll always be honest with the claim, but the scenario for the claimed improvement will always be chosen to make the claim as large as possible, sometimes with laughable results.

Typically something like "watch videos for 3x longer <small>when viewing 4k h265 video</small>" (which really means the new silicon gained an h265 hardware decoder the previous generation lacked).

nabakin
0 replies
2h53m

Maybe for battery life, but definitely not when it comes to CPU/GPU performance. Tbf, no chip company is, but Apple is particularly egregious. Their charts assume best case multi-core performance when users rarely ever use all cores at once. They'd have you thinking it's the equivalent of a 3090 or that you get double the frames you did before when the reality is more like 10% gains.

izacus
0 replies
3h22m

Like when they said a MacBook Air would last 10+ hours and third-party reviewers would get 8-9+ hours.

For literal YEARS, Apple battery life claims were a running joke on how inaccurate and overinflated they were.

dylan604
0 replies
2h29m

Yeah, the assumption seems to be that one component using less power means that power will just magically go unused. As with everything else in life, as soon as something stops using a resource, something else fills the vacuum to take advantage of it.

bee_rider
0 replies
3h28m

Controlling the OS is probably a big help there. At least, I saw lots of complaints about my zenbook model’s battery not hitting the spec. It was easy to hit or exceed it in Linux, but you have to tell it not to randomly spin up the CPU.

philistine
39 replies
6h20m

Quickly looking at the press release, it seems to have the same comparisons as in the video. None of Apple's comparisons today are between the M3 and M4. They are ALL comparing the M2 and M4. Why? It's frustrating, but today Apple replaced a product with an M2 with a product with an M4. Apple always compares product to product, never component to component when it comes to processors. So those specs look far more impressive than they would if we had M3 vs. M4 numbers.

jorvi
8 replies
6h15m

Didn't they do extreme cherry-picking for their tests so they could show the M1 beating a 3090 (or the M2 a 4090, I can't remember)?

Gave me quite a laugh when Apple users started to claim they'd be able to play Cyberpunk 2077 maxed out with maxed out raytracing.

philistine
6 replies
6h5m

I'll give you that Apple's comparisons are sometimes inscrutable. I vividly remember that one.

https://www.theverge.com/2022/3/17/22982915/apple-m1-ultra-r...

Apple was comparing the power envelope (already a complicated concept) of their GPU against a 3090. Apple wanted to show that the peak of their GPU's performance was reached with a fraction of the power of a 3090. What was terrible was that Apple was cropping their chart at the point where the 3090 was pulling ahead in pure compute by throwing more watts at the problem. So their GPU was not as powerful as a 3090, but a quick glance at the chart would completely tell you otherwise.

Ultimately we didn't see one of those charts today, just a mention about the GPU being 50% more efficient than the competition. I think those charts are beloved by Johny Srouji and no one else. They're not getting the message across.

izacus
5 replies
5h42m

Plenty of people on HN thought that the M1 GPU was as powerful as a 3090, so I think the message worked very well for Apple.

They really love those kinds of comparisons - e.g. they also compared M1s against really old Intel CPUs to make the numbers look better, knowing that news headlines won't care about the details.

philistine
3 replies
5h37m

They compared against really old Intel CPUs because those were the last ones they used in their own computers! Apple likes to compare device to device, not component to component.

oblio
1 replies
3h46m

You say that like it's not a marketing gimmick meant to mislead and obscure facts.

It's not some virtue that causes them to do this.

threeseed
0 replies
2h34m

It's funny because your comment is meant to mislead and obscure facts.

Apple compared against Intel to encourage their previous customers to upgrade.

There is nothing insidious about this and is in fact standard business practice.

izacus
0 replies
3h20m

No, they compared it because it made them look way better to naive people. They have no qualms comparing to other competition when it suits them.

Your explanation is a really baffling case of corporate white knighting.

w0m
0 replies
5h28m

not component to component

that's honestly kind of stupid when discussing things like 'new CPU!' like this thread.

I'm not saying the M4 isn't a great platform, but holy cow the corporate tripe people gobble up.

refulgentis
0 replies
6h4m

Yes, can't remember the precise combo either, there was a solid year or two of latent misunderstandings.

I eventually made a visual showing it was the same as claiming your iPhone was 3x the speed of a Core i9: sure, if you limit the power draw of your PC to a battery the size of a Post-it pad.

Similar issues arose when on-device LLMs happened; thankfully it's quieted down since then (the last egregious thing I saw was stonk-related wishcasting that Apple was obviously turning its Xcode CI service into a full-blown AWS competitor that'd wipe the floor with any cloud service, given the 2x performance).

sod
6 replies
5h54m

Yes, kinda annoying. But on the other hand, given that Apple releases a new chip every 12 months, we can cut them some slack here; from AMD, Intel, or Nvidia we usually see a 2-year cadence.

dartos
4 replies
5h47m

There’s probably easier problems to solve in the ARM space than x86 considering the amount of money and time spent on x86.

That’s not to say that any of these problems are easy, just that there’s probably more lower hanging fruit in ARM land.

kimixa
3 replies
5h10m

And yet they seem to be the only people picking the apparently "low-hanging fruit" in ARM land. We'll see about Qualcomm's Nuvia-based stuff, but that's been "nearly released" for what feels like years now, yet you still can't buy one to actually test.

And don't underestimate the investment Apple made - it's likely at a similar level to the big x86 incumbents. I mean AMD's entire Zen development team cost was likely a blip on the balance sheet for Apple.

0x457
0 replies
3h30m

Good to know that it's finally seeing the light of day. I thought they were still in a legal dispute with Arm about Nuvia's design?

re-thc
0 replies
4h59m

We'll see about Qualcomm's Nuvia-based stuff, but that's been "nearly released" for what feels like years now, but you still can't buy one to actually test.

That's more bound by legal than technical reasons...

blackoil
0 replies
4h54m

Maybe for GPUs, but for CPUs both Intel and AMD release on a yearly cadence. Even when Intel has nothing new to release, the generation number is bumped.

MBCook
5 replies
5h8m

It’s an iPad event and there were no M3 iPads.

That’s all. They’re trying to convince iPad users to upgrade.

We’ll see what they do when they get to computers later this year.

epolanski
4 replies
3h45m

I have a Samsung Galaxy Tab S7 FE, and I can't think of any use case where I'd need more power.

I agree that the iPad has more interesting software than Android for use cases like video or music editing, but I don't do those on a tablet anyway.

I just can't imagine anyone upgrading their M2 iPad for this, except a tiny niche that really wants the extra power.

r00fus
1 replies
3h7m

AI on the device may be the real reason for an M4.

MBCook
0 replies
1h19m

Previous iPads have had that for a long time. Since the A12 in 2018. The phones had it even earlier with the A11.

Sure this is faster but enough to make people care?

It may depend heavily on what they announce is in the next version of iOS/iPadOS.

MBCook
1 replies
3h16m

The A series was good enough.

I’m vaguely considering this, but entirely for the screen. The chip has been irrelevant to me for years; it’s long past the point where I’d notice a difference.

nomel
0 replies
14m

The A series was definitely not good enough. It really depends on what you're using it for. Netflix and web? Sure. But any old HDR tablet that can maintain 24Hz is good enough for that.

These are 2048x2732, 120Hz displays that support 6K external displays. Gaming and art apps push them pretty hard. For the iPad user in my house, going from the 2020 non-M* iPad to a 2023 M2 iPad made a huge difference for the drawing apps. Better latency is always better for drawing, and complex brushes (especially newer ones), selections, etc., would get fairly unusable.

For gaming, it was pretty trivial to dip well below 60Hz with a non-M* iPad in some of the higher-demand games like Fortnite, Minecraft (high view distance), Roblox (it ain't what it used to be), etc.

But, the apps will always gravitate to the performance of the average user. A step function in performance won't show up in the apps until the adoption follows, years down the line. Not pushing the average to higher performance is how you stagnate the future software of the devices.

kiba
4 replies
6h12m

I like the comparison of much older hardware with brand new to highlight how far we've come.

chipdart
3 replies
6h3m

I like the comparison of much older hardware with brand new to highlight how far we've come.

That's ok, but why skip the previous iteration then? Isn't the M2 only two generations behind? It's not that much older. It's also a marketing blurb, not a reproducible benchmark. Why leave out comparisons with the previous iteration even when you're just hand-waving over your own data?

FumblingBear
2 replies
5h45m

In this specific case, it's because iPads never got the M3. They're literally comparing it with the previous model of iPad.

There were some disingenuous comparisons throughout the presentation, going back to the A11 for the first Neural Engine and some comparisons to the M1, but the M2 comparison actually makes sense.

philistine
1 replies
5h35m

I wouldn't call the comparison to the A11 disingenuous; they were very clear that they were talking about how far their neural engines have come, in the context of the competition just starting to put NPUs in their stuff.

I mean, they compared the new iPad Pro to an iPod nano; that's just using your own history to make a point.

FumblingBear
0 replies
5h32m

Fair point - I just get a little annoyed when the marketing speak confuses the average consumer, and I felt as though some of the jargon they used could trip less-informed customers up.

mlsu
3 replies
5h6m

They know that anyone who has bought an M3 is good on computers for a long while. They're targeting people who have M2 or older Macs. People who own an M3 are basically going to buy anything that comes down the pipe anyway, because who needs an M3 over an M2 or even an M1 today?

abnercoimbre
2 replies
4h26m

I’m starting to worry that I’m missing out on some huge gains (M1 Air user.) But as a programmer who’s not making games or anything intensive, I think I’m still good for another year or two?

richiebful1
0 replies
3h48m

I have an M1 Air and I test drove a friend's recent M3 Air. It's not very different performance-wise for what I do (programming, watching video, editing small memory-constrained GIS models, etc)

mlsu
0 replies
30m

I wanted to upgrade my M1 because it was going to swap a lot with only 8 gigs of RAM and because I wanted a machine that could run big LLMs locally. Ended up going 8G macbook air M1 -> 64G macbook pro M1. My other reasoning was that it would speed up compilation, which it has, but not by too much.

The M1 air is a very fast machine and is perfect for anyone doing normal things on the computer.

yieldcrv
2 replies
6h5m

Personally I think this is the comparison most people want. The M3 had a lot of compromises relative to the M2.

That aside, the M4 is about the Neural Engine upgrades above all else (which probably should have been compared to the M3).

dakiol
1 replies
4h46m

What are those compromises? I may buy an M3 MBP, so I'd like to hear more.

fh9302
0 replies
4h39m

The M3 Pro had some downgrades compared to the M2 Pro: fewer performance cores and lower memory bandwidth. This did not apply to the M3 and M3 Max.

raydev
0 replies
4h32m

They are ALL comparing the M2 and M4. Why?

Well, the obvious answer is that those with older machines are more likely to upgrade than those with newer machines. The market for insta-upgraders is tiny.

edit: And perhaps an even more obvious answer: there are no iPads that contained the M3, so the comparison would be less meaningful. The M4 was just launched today, exclusively in iPads.

mkl
0 replies
1h8m

Apple always compares product to product, never component to component when it comes to processors.

I don't think this is true. When they launched the M3 they compared primarily to M1 to make it look better.

mh8h
0 replies
4h51m

That's because the previous iPad Pros came with M2, not M3. They are comparing the performance with the previous generation of the same product.

loongloong
0 replies
4h12m

It doesn't seem plausible to me that Apple would release an "M3 variant" that can drive "tandem OLED" displays. So it's probably logical to package whatever chip progress they had (including process improvements) into the "M4".

And it signals "we are serious about the iPad as a computer" by using their latest chip.

A logical alignment of progress in engineering (and manufacturing), packaged smartly to generate marketing capital for sales and brand value creation.

Wonder how the newer Macs will use these "tandem OLED" capabilities of the M4.

homarp
0 replies
6h13m

Because the previous iPad was M2. So "remember how fast your previous iPad was?" Well, this one is N times better.

illusive4080
4 replies
6h25m

From their specs page, battery life is unchanged. I think they spent the chip's power savings offsetting the increased consumption of the tandem OLED.

xattt
3 replies
6h14m

I’ve not seen discussion of the fact that Apple likely scales chip performance to match the use profile of the specific device it’s used in. An M2 in an iPad Air is very likely not the same as an M2 in an MBP or Mac Studio.

refulgentis
0 replies
6h1m

Surprisingly, I think it is: I was going to comment the same here, then checked Geekbench; single-core scores match for the M2 iPad/MacBook Pro/etc. at the same clock speed. I.e. M2 "base" = M2 "base", but core counts differ, and with the desktops/laptops you get options for M2 Ultra Max SE bla bla.

mananaysiempre
0 replies
6h4m

A Ryzen 7840U in a gaming handheld is not (configured) the same as a Ryzen 7840U in a laptop, for that matter, so Apple is hardly unique here.

joakleaf
0 replies
5h56m

The Geekbench [1,2] benchmarks for the M2 are:

Single core: iPad Pro (M2): 2539, MacBook Air (M2): 2596, MacBook Pro (M2): 2645

Multi core: iPad Pro (M2, 8-core): 9631, MacBook Air (M2, 8-core): 9654, MacBook Pro (M2, 8-core): 9642

So, it appears to be almost the same performance (until it throttles due to heat, of course).

1. https://browser.geekbench.com/ios-benchmarks
2. https://browser.geekbench.com/mac-benchmarks

alphakappa
4 replies
6h29m

Any product that uses this is more than just the chip, so you cannot get a proportional change in battery life.

izacus
3 replies
6h24m

Sure, but I also remember them comparing the M1 chip to an RTX 3090, and my MacBook M1 Pro doesn't really run games well.

So I've become really suspicious about any claims about performance done by Apple.

servus45678981
0 replies
6h6m

That is the fault of the devs. Optimization for dedicated graphics cards is either integrated in the game engine or they just ship a separate version for RTX users.

filleduchaos
0 replies
5h14m

I mean, I remember Apple comparing the M1 Ultra to Nvidia's RTX 3090. While that chart was definitely putting a spin on things to say the least, and we can argue from now until tomorrow about whether power consumption should or should not be equalised, I have no idea why anyone would expect the M1 Pro (an explicitly much weaker chip) to perform anywhere near the same.

Also what games are you trying to play on it? All my M-series Macbooks have run games more than well enough with reasonable settings (and that has a lot more to do with OS bugs and the constraints of the form factor than with just the chipset).

achandlerwhite
0 replies
5h7m

They compared them in terms of perf/watt, which did hold up, but obviously implied higher performance overall.

mcv
3 replies
4h10m

I don't know, but the M3 MBP I got from work already gives the impression of using barely any power at all. I'm really impressed by Apple Silicon, and I'm seriously reconsidering my decision from years ago to never ever buy Apple again. Why doesn't everybody else use chips like these?

jacurtis
1 replies
4h1m

I have an M3 for my personal laptop and an M2 for my work laptop. I get ~8 hours if I'm lucky on my work laptop, but I have attributed most of that battery loss to all the "protection" software they put on my work laptop that is always showing up under the "Apps Using Significant Power" category in the battery dropdown.

I can have my laptop with nothing on screen, and the battery still points to TrendMicro and others as the cause of heavy battery drain while my laptop seemingly idles.

I recently upgraded my personal laptop to the M3 MacBook pro and the difference is astonishing. I almost never use it plugged in because I genuinely get close to that 20 hour reported battery life. Last weekend I played a AAA Video Game through Xbox Cloud Gaming (awesome for mac gamers btw) and with essentially max graphics (rendered elsewhere and streamed to me of course), I got sucked into a game for like 5 hours and lost only 8% of my battery during that time, while playing a top tier video game! It really blew my mind. I also use GoLand IDE on there and have managed to get a full day of development done using only about 25-30% battery.

So yeah, whatever Apple is doing, they are doing it right. Performance without all the spyware that your work gives you makes a huge difference too.

bee_rider
0 replies
3h17m

For the AAA video game example, I mean, it is interesting how far that kind of tech has come… but really that's just video streaming (maybe slightly more difficult because latency matters?) from the point of view of the laptop, right? The quality of the graphics there has more or less nothing to do with the battery.

jhickok
0 replies
3h59m

I think the market will move to using chips like this, or at least have additional options. The new Snapdragon SOC is interesting, and I would suspect we could see Google and Microsoft play in this space at some point soon.

wwilim
0 replies
4h6m

Isn't 15% more battery life a huge improvement on a device already well known for long battery life?

smith7018
0 replies
4h42m

You wouldn’t necessarily get twice the battery life. It could be less than that due to the thinner body causing more heat, a screen that utilizes more energy, etc

michaelmior
0 replies
4h43m

is it REALLY half the power use of all times (so we'll get double the battery life)

I'm not sure what you mean by "of all times" but half the battery usage of the processor definitely doesn't translate into double the battery life since the processor is not the only thing consuming power.

coldtea
0 replies
5h12m

Apple might use simplified and opaque plots to drive their point, but they all too often undersell the differences. Independent reviews, for example, find that they not only hit the mark Apple mentions for things like battery life, but often do slightly better...

can16358p
0 replies
2h42m

Apple is one of the few companies that underpromise and overdeliver and never exaggerate.

Compared to the competition, I'd trust Apple much more than the Windows laptop OEMs.

bagels
0 replies
5h19m

CPU is not the only way that power is consumed in a portable device. It is a large fraction, but you also have displays and radios.

aledalgrande
0 replies
6h21m

Well, the battery would be used by other things too, right? Especially by that double OLED screen. "Best ever" in every keynote makes me laugh at this point, but it doesn't mean that they're not improving their power envelope.

cs702
49 replies
6h45m

Potentially > 2x greater battery life for the same amount of compute!

That is pretty crazy.

Or am I missing something?

eqvinox
19 replies
6h39m

Sadly, this is only processor power consumption; you need to put power into a whole lot of other things to make a useful computer… a display backlight and the system's RAM come to mind as particular offenders.

treesciencebot
16 replies
6h28m

The backlight is now the main bottleneck for consumption-heavy uses. I wonder what advancements are happening there to optimize the wattage.

neutronicus
7 replies
5h51m

Please give me an external ePaper display so I can just use Spacemacs in a well-lit room!

vsuperpower2020
3 replies
5h30m

I'm still waiting for the technology to advance. People can't reasonably spend $1500 on the world's shittiest computer monitor, even if it is on sale.

neutronicus
2 replies
5h3m

Dang, yeah, this is the opposite of what I had in mind

I was thinking, like, a couple hundred dollar Kindle the size of a big iPad I can plug into a laptop for text-editing out and about. Hell, for my purposes I'd love an integrated keyboard.

Basically a second, super-lightweight laptop form-factor I can just plug into my chonky Macbook Pro and set on top of it in high-light environments when all I need to do is edit text.

Honestly not a compelling business case now that I write it out, but I just wanna code under a tree lol

mholm
0 replies
3h57m

I think we're getting pretty close to this. The Remarkable 2 tablet is $300, but can't take video input and software support for non-notetaking is near non-existent. There's even a keyboard available. Boox and Hisense are also making e-ink tablets/phones for reasonable prices.

eqvinox
0 replies
4h22m

A friend bought it & I had a chance to see it in action.

It is nice for some very specific use cases. (They're in the publishing/typesetting business. It's… idk, really depends on your usage patterns.)

Other than that, yeah, the technology just isn't there yet.

craftkiller
0 replies
3h23m

If that existed as a drop-in screen replacement for the Framework laptop, with a high-refresh-rate color Gallery 3 panel, then I'd buy it at that price point in a heartbeat.

I can't replace my desktop monitor with eink because I occasionally play video games. I can't use a 2nd monitor because I live in a small apartment.

I can't replace my laptop screen with greyscale because I need syntax highlighting for programming.

gumby
0 replies
4h29m

Maybe the $100 nano-texture screen will give you the visibility you want. Not the low power of an e-paper screen though.

Hmm, emacs on an epaper screen might be great if it had all the display update optimization and "slow modem mode" that Emacs had back in the TECO days. (The SUPDUP network protocol even implemented that at the client end and interacted with Emacs directly!)

sangnoir
2 replies
6h18m

Is the iPad Pro not yet on OLED? All of Samsung's flagship tablets have had OLED screens for well over a decade now. It eliminates the need for backlighting, has superior contrast, and is pleasant to use in low-light conditions.

kbolino
0 replies
6h3m

I'm not sure how OLED and backlit LCD compare power-wise exactly, but OLED screens still need to put off a lot of light, they just do it directly instead of with a backlight.

callalex
0 replies
1h24m

The iPad that came out today finally made the switch. iPhones made the switch around 2016. It does seem odd how long it took for the iPad to switch, but Samsung definitely switched too early: my Galaxy Tab 2 suffered from screen burn in that I was never able to recover from.

whereismyacc
1 replies
5h55m

QD-oled reduces it by like 25% I think? But maybe that will never be in laptops, I'm not sure.

eqvinox
0 replies
4h7m

QD-OLED is an engineering improvement, i.e. combining existing, researched technology to improve the resulting product. I wasn't able to find a good source on what exactly it improves in efficiency, but it's not a fundamental improvement in OLED electrical→optical energy conversion (if my understanding is correct.)

In general, OLED screens seem to have an efficiency around 20-30%. Some research departments seem to be trying to bump that up [https://www.nature.com/articles/s41467-018-05671-x], which I'd be more hopeful about…

…but, honestly, at some point you just hit the limits of physics. It seems internal scattering is already a major problem; maybe someone can invent pixel-sized microlasers and that'd help? More than 50-60% seems like a pipe dream at this point…

…unless we can change to a technology that fundamentally doesn't emit light, i.e. e-paper and the likes. Or just LCD displays without a backlight, using ambient light instead.

craftkiller
1 replies
3h57m

AMD GPUs have "Adaptive Backlight Management", which reduces your screen's backlight but then tweaks the colors to compensate. For example, my laptop's backlight is set at 33%, but with ABM it reduces the backlight to 8%. Personally I don't even notice it is on / my screen seems just as bright as before, but when I first enabled it I did notice a slight difference in colors, so it's probably not suitable for designers/artists. I'd 100% recommend it for coders though.
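
For reference, roughly how it can be toggled on Linux; the parameter and sysfs names below are my assumption and depend on kernel/driver version, so double-check against your amdgpu docs:

    # at boot: pick an ABM level from 0 (off) to 4 (most aggressive), e.g.
    #   amdgpu.abmlevel=3 on the kernel command line
    # at runtime (panel path varies per machine; needs a recent kernel):
    echo 3 | sudo tee /sys/class/drm/card0-eDP-1/amdgpu/panel_power_savings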

ProfessorLayton
0 replies
2h19m

Strangely, Apple seems to be doing the opposite for some reason (color accuracy?): dimming the display doesn't seem to reduce the backlight as much, and they appear to be using software dimming on top, even at "max" brightness.

Evidence can be seen when opening iOS apps, which seem to glitch out and reveal the brighter backlight [1]. Notice how #FFFFFF white isn't the same brightness as the white in the iOS app.

[1] https://imgur.com/a/cPqKivI

devsda
0 replies
6h17m

If the use cases involve working in dark terminals all day, watching movies with dark scenes, or if the general theme is dark, maybe the new OLED display will help reduce the display's power consumption too.

naikrovek
0 replies
6h11m

that's still amazing, to me.

I don't expect an M4 macbook to last any longer than an M2 macbook of otherwise similar specs; they will spend that extra power budget on things other than the battery life specification.

cs702
0 replies
6h29m

Thanks. That makes sense.

Hamuko
14 replies
6h36m

Comparing the tech specs for the outgoing and new iPad Pro models, that potential is very much not real.

Old: 28.65 Wh (11") / 40.88 Wh (13"), up to 10 hours of surfing the web on Wi-Fi or watching video.

New: 31.29 Wh (11") / 38.99 Wh (13"), up to 10 hours of surfing the web on Wi-Fi or watching video.

codedokode
6 replies
6h25m

Isn't it weird that the new chip consumes half the power, but the battery life is the same?

rsynnott
3 replies
6h16m

The OLED likely adds a fair bit of draw; they're generally somewhat more power-hungry than LCDs these days, assuming like-for-like brightness. Realistically, this will be the case until MicroLEDs are available for non-completely-silly money.

CoastalCoder
2 replies
6h1m

This surprises me. I thought the big power downside of LCD displays is that they use filtering to turn unwanted color channels into waste heat.

Knowing nothing else about the technology, I assumed that would make OLED displays more efficient.

sroussey
0 replies
4h14m

OLED will use less power for a black screen and LCD will use less for a white screen. Now take the average of whatever content is on your screen, and for you it may be better or it may be worse.

White background document editing, etc., will be worse, and this is rather common.

masklinn
0 replies
6h18m

It's not weird when you consider that browsing the web or watching videos has the CPU idle or near enough, so 95% of the power draw is from the display and radios.

beeboobaa3
0 replies
4h5m

No, they have a "battery budget". If the CPU power draw goes down, that means the budget goes up and you can spend it on other things, like a nicer display or some other feature.

When you say "up to 10 hours" most people will think "oh nice that's an entire day" and be fine with it. It's what they're used to.

Turning that into 12 hours might be possible but are the tradeoffs worth it? Will enough people buy the device because of the +2 hour battery life? Can you market that effectively? Or will putting in a nicer fancy display cause more people to buy it?

We'll never get significant battery life improvements because of this, sadly.

masklinn
2 replies
6h19m

Yeah, double the PPW does not mean double the battery life, because unless you're pegging the CPU/SoC it's likely only a small fraction of the power consumption of a light-use or idle device, especially for an SoC which originates in mobile devices.

Doing basic web navigation with some music in the background, my old M1 Pro has short bursts at ~5W (for the entire SoC) when navigating around, a couple of watts for mild webapps (e.g. checking various channels in Discord), and typing into this here textbox it's sitting happy at under half a watt, with the P-cores essentially idle and the E-cores at under 50% utilisation.

With a 100Wh battery that would be a "potential" of 150 hours or so. Except nobody would ever sell it for that, because between the display and radios the laptop is actually pulling 10-11W.
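
To put rough numbers on it (back-of-the-envelope; the ~0.65W is just what a ~150h "potential" implies, and 10.5W is the midpoint of the 10-11W above):

    # ~100 Wh battery, SoC averaging ~0.65 W at light use, whole laptop ~10.5 W
    echo "scale=0; 100 / 0.65" | bc    # ~153 h if the SoC were the only consumer
    echo "scale=1; 100 / 10.5" | bc    # ~9.5 h for the whole machine
    # halve the SoC's draw and the total only falls to ~10.2 W:
    echo "scale=1; 100 / 10.2" | bc    # ~9.8 h, a ~3% gain rather than 2x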

tracker1
0 replies
1h38m

On my M1 Air, I find that with casual use of about an hour or so a day, I can literally go close to a couple of weeks without needing to recharge, which to me is pretty awesome. I mostly use my personal desktop when not on my work laptop (a docked M3 Pro).

pxc
0 replies
4h11m

So this could be a bit helpful for heavier duty usage while on battery.

jeffbee
1 replies
6h28m

A more efficient CPU can't improve that spec because those workloads use almost no CPU time and the display dominates the energy consumption.

Hamuko
0 replies
6h20m

Unfortunately Apple only ever thinks about battery life in terms of web surfing and video playback, so we don't get official battery-life figures for anything else. Perhaps you can get more battery life out of your iPad Pro web surfing by using dark mode, since OLEDs should use less power than IPS displays with darker content.

fvv
0 replies
6h24m

this

binary132
0 replies
6h29m

Ok, but is it twice as fast during those 10 hours, leading to 20 hours of effective websurfing? ;)

krzyk
10 replies
6h40m

Wait a bit. M2 wasn't as good as the hype was.

modeless
8 replies
6h24m

That's because M2 was on the same TSMC process generation as M1. TSMC is the real hero here. M4 is the same generation as M3, which is why Apple's marketing here is comparing M4 vs M2 instead of M3.

mensetmanusman
2 replies
4h51m

Saying TSMC is a hero ignores the thousands of suppliers that improved everything required for TSMC to operate. TSMC is the biggest, so they get the most experience on all the new toys the world’s engineers and scientists are building.

whynotminot
0 replies
4h18m

It's almost as if every part of the stack -- from the uArch that Apple designs down to the insane machinery from ASML, to the fully finished SoC delivered by TSMC -- is vitally important to creating a successful product.

But people like to assign credit solely to certain parts if it suits their narrative (lately, the narrative being that Apple isn't actually all that special at designing their chips, and it's all just the process advantage).

modeless
0 replies
1h50m

Saying TSMC's success is due to their suppliers ignores the fact that all of their competitors failed to keep up despite having access to the same suppliers. TSMC couldn't do it without ASML, but Intel and Samsung failed to do it even with ASML.

In contrast, when Apple's CPU and GPU competitors get access to TSMC's new processes after Apple's exclusivity period expires, they achieve similar levels of performance (except for Qualcomm because they don't target the high end of CPU performance, but AMD does).

geodel
2 replies
5h36m

And why are other PC vendors not latching on to the hero?

modeless
0 replies
1h53m

Apple pays TSMC for exclusivity on new processes for a period of time.

mixmastamyk
0 replies
5h25m

Apple often buys their entire capacity (of a process) for quite a while.

sys_64738
0 replies
5h47m

I thought M3 and M4 were different processes though. Higher yield for the latter or such.

jonathannorris
0 replies
5h44m

Actually, M4 is reportedly on a more cost-efficient TSMC N3E node, where Apple was apparently the only customer on the more expensive TSMC N3B node; I'd expect Apple to move away from M3 to M4 very quickly for all their products.

https://www.trendforce.com/news/2024/05/06/news-apple-m4-inc....

fallat
0 replies
6h39m

This. Remember folks Apple's primary goal is PROFIT. They will tell you anything appealing before independent tests are done.

BurningFrog
1 replies
6h27m

Is the CPU/GPU really dominating power consumption that much?

masklinn
0 replies
4h34m

Nah, GP is off their rocker. For the workloads in question the SOC's power draw is a rounding error, low single-digit percent.

mvkel
22 replies
6h24m

And here it is in an OS that can't even max out an M1!

That said, the function keys make me think "and it runs macOS" is coming, and THAT would be extremely compelling.

a_vanderbilt
20 replies
5h59m

We've seen a slow march over the last decade towards the unification of iOS and macOS. Maybe not a "it runs macOS", but an eventual "they share all the same apps" with adaptive UIs.

DrBazza
9 replies
5h40m

They probably saw the debacle that was Windows 8 and thought merging a desktop and touch OS is a decade-long gradual task, if that is even the final intention.

Unlike MS that went with the Big Bang in your face approach that was oh-so successful.

kmeisthax
2 replies
4h2m

You're right about the reason but wrong about the timeline: Jobs saw Windows XP Tablet Edition and built a skunkworks at Apple to engineer a tablet that did not require a stylus. This was purely to spite a friend[0] of his that worked at Microsoft and was very bullish on XP tablets.

Apple then later took the tablet demo technology, wrapped it up in a very stripped-down OS X with a different window server and UI library, and called it iPhone OS. Apple was very clear from the beginning that Fingers Can't Use Mouse Software, Damn It, and that the whole ocean needed to be boiled to support the new user interface paradigm[1]. They even have very specific UI rules specifically to ensure a finger never meets a desktop UI widget, including things like iPad Sidecar just not forwarding touch events at all and only supporting connected keyboards, mice, and the Apple Pencil.

Microsoft's philosophy has always been the complete opposite. Windows XP through 7 had tablet support that amounted to just some affordances for stylus users layered on top of a mouse-only UI. Windows 8 was the first time they took tablets seriously, but instead of just shipping a separate tablet OS or making Windows Phone bigger, they turned it into a parasite that ate the Windows desktop from the inside-out.

This causes awkwardness. For example, window management. Desktops have traditionally been implemented as a shared data structure - a tree of controls - that every app on the desktop can manipulate. Tablets don't support this: your app gets one[2] display surface to present their whole UI inside of[3], and that surface is typically either full-screen or half-screen. Microsoft solved this incongruity by shoving the entire Desktop inside of another app that could be properly split-screened against the new, better-behaved tablet apps.

If Apple were to decide "ok let's support Mac apps on iPad", it'd have to be done in exactly the same way Windows 8 did it, with a special Desktop app that contained all the Mac apps in a penalty box. This is so that they didn't have to add support for all sorts of incongruous, touch-hostile UI like floating toolbars, floating pop-ups, global menus, five different ways of dragging-and-dropping tabs, and that weird drawer thing you're not supposed to use anymore, to iPadOS. There really isn't a way to gradually do this, either. You can gradually add feature parity with macOS (which they should), but you can't gradually find ways to make desktop UI designed by third-parties work on a tablet. You either put it in a penalty box, or you put all the well-behaved tablet apps in their own penalty boxes, like Windows 10.

Microsoft solved Windows 8's problems by going back to the Windows XP/Vista/7 approach of just shipping a desktop for fingers. Tablet Mode tries to hide this, but it's fundamentally just window management automation, and it has to handle all the craziness of desktop. If a desktop app decides it wants a floating toolbar or a window that can't be resized[4], Tablet Mode has to honor that request. In fact, Tablet Mode needs a lot of heuristics to tell what floating windows pair with which apps. So it's a lot more awkward for tablet users in exchange for desktop users having a usable desktop again.

[0] Given what I've heard about Jobs I don't think Jobs was psychologically capable of having friends, but I'll use the word out of convenience.

[1] Though the Safari team was way better at building compatibility with existing websites, so much so that this is the one platform that doesn't have a deep mobile/desktop split.

[2] This was later extended to multiple windows per app, of course.

[3] This is also why popovers and context menus never extend outside their containing window on tablets. Hell, also on websites. Even when you have multiwindow, there's no API surface for "I want to have a control floating on top of my window that is positioned over here and has this width and height".

[4] Which, BTW, is why the iPad has no default calculator app. Before Stage Manager there was no way to have a window the size of a pocket calculator.

pram
0 replies
3h22m

Clip Studio is one Mac app port I’ve seen that was literally the desktop version moved to the iPad. It uniquely has the top menu bar and everything. They might have made an exception because you’re intended to use the pencil and not your fingers.

Bluecobra
0 replies
2h9m

Honestly, using a stylus isn't that bad. I've had to support floor traders for many years and they all still use a Windows-based tablet + a stylus to get around. Heck, even Palm devices were a pleasure to use. Not sure why Steve was so hell bent against them, it probably had to do with his beef with Sculley/Newton.

bluescrn
2 replies
4h28m

At this point, there's two fundamentally different types of computing that will likely never be mergeable in a satisfactory way.

We now have 'content consumption platforms' and 'content creation platforms'.

While attempts have been made to try and enable some creation on locked-down touchscreen devices, you're never going to want to try and operate a fully-featured version of Photoshop, Maya, Visual Studio, etc on them. And if you've got a serious workstation with multiple large monitors and precision input devices, you don't want to have dumbed-down touch-centric apps forced upon you Win8-style.

The bleak future that seems likely is that the 'content creation platforms' become ever more niche and far more costly. Barriers to entry for content creators are raised significantly as mainstream computing is mostly limited to locked-down content consumption platforms. And Linux is only an option for as long as non-locked-down hardware is available for sensible prices.

eastbound
1 replies
53m

On the other hand, a $4000 mid-range MacBook doesn’t have a touchscreen, and that’s a heresy. Granted, you can get the one with the emoji bar, but why interact using touch on a bar when you could touch the screen directly?

Maybe the end game for Apple isn’t the full convergence, but just having a touch screen on the Mac.

bluescrn
0 replies
30m

Why would you want greasy finger marks on your Macbook screen?

Not much point having a touchscreen on a Macbook (or any laptop really), unless the hardware has a 'tablet mode' with a detachable or fold-away keyboard.

zitterbewegung
0 replies
5h12m

People have complained about why Logic Pro / Final Cut wasn't ported to the iPad Pro line. The obvious answer is that getting those workflows done properly takes time.

mvkel
0 replies
4h40m

I'd be very surprised if Apple is paying attention to anything that's happening with windows. At least as a divining rod for how to execute.

a_vanderbilt
0 replies
4h56m

Even with the advantage of time, I don't think Microsoft would have been able to do it. They can't even get their own UI situated, much less adaptive. Windows 10/11 is this odd mishmash of old and new, without a consistent language across it. They can't unify what isn't even cohesive in the first place.

skohan
3 replies
4h41m

Unfortunately I think "they share all the same apps" will not include a terminal with root access, which is what would really be needed to make iPad a general purpose computer for development

It's a shame, because it's definitely powerful enough, and the idea of traveling with just an iPad seems super interesting, but I imagine they will not extend those features to any devices besides macs

LordDragonfang
2 replies
4h11m

I mean, it doesn't even have to be true "root" access. Chromebooks have a containerized linux environment, and aside from the odd bug, the high end ones are actually great dev machines while retaining the "You spend most of your time in the browser so we may as well bake that into the OS" base layer.

presides
0 replies
55m

Been a while since I've used a chromebook but iirc there's ALSO root access that's just a bit more difficult to access, and you do actually need to access it from time to time for various reasons, or at least you used to.

a_vanderbilt
0 replies
2h0m

I actually do use a Chromebook in this way! Out of all the Linux machines I've used, that's why I like it. Give me a space to work and provide an OS that I don't have to babysit or mentally maintain.

reaperducer
0 replies
2h42m

Maybe not a "it runs macOS", but an eventual "they share all the same apps" with adaptive UIs

M-class MacBooks can already run many iPhone and iPad apps.

plussed_reader
0 replies
4h5m

The writing was on the wall with the introduction of Swift, IMO. Since then it's been overcomplicating the iPad and dumbing down the macOS interfaces to attain this goal. So much wasted touch/negative space in macOS since Catalina to compensate for fingers and adaptive interfaces; so many hidden menus and long taps squirreled away in iOS.

bonestamp2
0 replies
4h21m

Agreed, this will be the way forward in the future. I've already seen one of my apps (Authy) say "We're no longer building a macOS version, just install the iPad app on your mac".

That's great, but you need an M series chip in your Mac for that to work, so backwards compatibility only goes back a few years at this point, which is fine for corporate upgrade cycles but might be a bit short for consumers at this time. But it will be fine in the future.

beeboobaa3
0 replies
4h13m

Until an "iPhone" can run brew, all my developer tools, steam, epic games launcher, etc it's hardly interesting.

asabla
0 replies
5h48m

I think so too. Especially after the split from iOS to iPadOS. Hopefully they'll show something during this year's WWDC.

0x457
0 replies
4h37m

I will settle for: being able to connect 2 monitors to the iPad and select which audio device sound goes through. If I can run IntelliJ and compile Rust on the iPad, I promise to upgrade to the new iPad Pro as soon as it is released, every time.

criddell
0 replies
4h15m

And here it is in an OS that can't even max out an M1

Do you really want your OS using 100% of CPU?

philistine
6 replies
5h47m

Actually, TSMC's N3E process is somewhat of a regression on the first-generation 3nm process, N3. However, it is simpler and more cost-efficient, and everyone seems to want to get out of that N3 process as quickly as possible. That seems to be the biggest reason Apple released the A17(M3) generation and now the M4 the way they did.

The N3 process is in the A17 Pro, the M3, M3 Pro, and M3 Max. The A17 Pro name seems to imply you won't find it trickling down to the regular iPhones next year. So we'll see that processor only this year in phones, since Apple discontinues their Pro range of phones every year; only the regular phones trickle downrange, lowering their prices. The M3 devices are all Macs that needed an upgrade due to their popularity: the MacBook Pro and MacBook Air. They made three chips for them, but they did not make an M3 Ultra for the lower volume desktops. With the announcement of an M4 chip in iPads today, we can expect to see the MacBook Air and MacBook Pro upgraded to M4 soon, with the introduction of an M4 Ultra to match later. We can now expect those M3 devices to be discontinued instead of going downrange in price.

That would leave one device with an N3 process chip: the iMac. At its sales volume, I wouldn't be surprised if all the M3 chips that will go into it are made this year, with the model staying around for a year or two, running on fumes.

dhx
2 replies
5h15m

N3E still has a +9% logic transistor density increase on N3 despite a relaxation to design rules, for reasons such as introduction of FinFlex.[1] Critically though, SRAM cell sizes remain the same as N5 (reversing the ~5% reduction in N3), and it looks like the situation with SRAM cell sizes won't be improving soon.[2][3] It appears more likely that designers particularly for AI chips will just stick with N5 as their designs are increasingly constrained by SRAM.

[1] https://semiwiki.com/semiconductor-manufacturers/tsmc/322688...

[2] https://semiengineering.com/sram-scaling-issues-and-what-com...

[3] https://semiengineering.com/sram-in-ai-the-future-of-memory/

sroussey
1 replies
4h22m

SRAM has really stalled. I don’t think 5nm was much better than 7nm. On ever smaller nodes, SRAM will be taking up a larger and larger percent of the entire chip. But the cost is much higher on the smaller nodes even if the performance is not better.

I can see why AMD started putting the SRAM on top.

magicalhippo
0 replies
3h51m

It wasn't immediately clear to me why SRAM wouldn't scale like logic. This[1] article and this[2] paper sheds some light.

From what I can gather the key aspects are that decreased feature sizes lead to more variability between transistors, but also to less margin between on-state and off-state. Thus a kind of double-whammy. In logic circuits you're constantly overwriting with new values regardless of what was already there, so they're not as sensitive to this, while the entire point of a memory circuit is to reliably keep values around.

Alternate transistor designs such as FinFET, Gate-all-around and such can provide mitigation of some of this, say by reducing transistor-to-transistor variability by a factor, but they can't get around the root issue.

[1]: https://semiengineering.com/sram-scaling-issues-and-what-com...

[2]: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9416021/

GeekyBear
2 replies
5h23m

The signs certainly all point to the initial version of N3 having issues.

For instance, Apple supposedly required a deal where they only paid TSMC for usable chips per N3 wafer, and not for the entire wafer.

https://arstechnica.com/gadgets/2023/08/report-apple-is-savi...

dehrmann
1 replies
4h43m

My read on the absurd number of Macbook M3 SKUs was that they had yield issues.

GeekyBear
0 replies
4h3m

There is also the fact that we currently have an iPhone generation where only the Pro models got updated to chips on TSMC 3nm.

The next iPhone generation is said to be a return to form with all models using the same SOC on the revised version of the 3nm node.

Code from the operating system also indicates that the entire iPhone 16 range will use a new system-on-chip – t8140 – Tahiti, which is what Apple calls the A18 chip internally. The A18 chip is referenced in relation to the base model iPhone 16 and 16 Plus (known collectively as D4y within Apple) as well as the iPhone 16 Pro and 16 Pro Max (referred to as D9x internally)

https://www.macrumors.com/2023/12/20/ios-18-code-four-new-ip...

bitwize
5 replies
6h9m

Ultimately, does it matter?

Michelin-starred restaurants not only have top-tier chefs. They have buyers who negotiate with food suppliers to get the best ingredients they can at the lowest prices they can. Having a preferential relationship with a good supplier is as important to the food quality and the health of the business as having a good chef to prepare the dishes.

Apple has top-tier engineering talent but they are also able to negotiate preferential relationships with their suppliers, and it's both those things that make Apple a phenomenal tech company.

makeitdouble
4 replies
5h40m

Qualcomm is also with TSMC and their newer 4nm processor is expected to stay competitive with the M series.

If the magic comes mostly from TSMC, there's a good chance for these claims to be true and to have a series of better chips coming on the other platforms as well.

0x457
1 replies
4h32m

Does Qualcomm have any new CPU cores besides the one that ARM says they can't make due to licensing?

stouset
0 replies
43m

“Stay” competitive implies they’ve been competitive. Which they haven’t.

I’m filing this into the bin with all the other “This next Qualcomm chip will close the performance gap” claims made over the past decade. Maybe this time it’ll be true. I wouldn’t bet on it.

hot_gril
0 replies
5h9m

This info is much more useful than a comparison to restaurants.

halgir
0 replies
4h35m

If they maintain that pace, it will start compounding incredibly quickly. If we round to 2 years vs 2.5 years, after just a decade you're an entire doubling ahead.

luyu_wu
0 replies
3h28m

Note that performance per watt is 2x higher at both chips' peak performance. This is in many ways an unfair comparison for Apple to make.

tuckerpo
0 replies
4h44m

It is almost certainly half as much power in the RMS sense, not absolute.

smrtinsert
0 replies
4h31m

Breathtaking

nblgbg
0 replies
4h16m

That doesn't seem to reflect in the battery life of these. They have the same exact battery life. Does it mean it's not entirely accurate? Since they don't indicate the battery capacity in their specs, it's hard to confirm this.

mensetmanusman
0 replies
4h59m

Also the thousands of suppliers that have improved their equipment and supplies that feed into the tsmc fabs.

beeboobaa3
0 replies
4h17m

And compared with the latest PC chip in a thin and light laptop, M4 can deliver the same performance using just a fourth of the power

It can deliver the same performance as itself at just a fourth of the power it's using? That's incredible!

GeekyBear
0 replies
6h5m

They don't mention which metric is 50% higher.

However, we have more CPU cores, a newer core design, and a newer process node which would all contribute to improving multicore CPU performance.

Also, Apple is conservative on clock speeds, but those do tend to get bumped up when there is a new process node as well.

2OEH8eoCRo0
0 replies
6h37m

I wonder how much of that is Apple engineering and how much is TSMC improving their 3nm process.

I think Apple's design choices had a huge impact on the M1's performance but from there on out I think it's mostly due to TSMC.

rsp1984
157 replies
3h35m

Together with next-generation ML accelerators in the CPU, the high-performance GPU, and higher-bandwidth unified memory, the Neural Engine makes M4 an outrageously powerful chip for AI.

In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).

Processing data at the edge also makes for the best possible user experience because of the complete independence of network connectivity and hence minimal latency.

If (and that's a big if) they keep their APIs open to run any kind of AI workload on their chips it's a strategy that I personally really really welcome as I don't want the AI future to be centralised in the hands of a few powerful cloud providers.

andsoitis
55 replies
2h33m

In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).

Their primary business goal is to sell hardware. Yes, they’ve diversified into services and being a shopping mall for all, but it is about selling luxury hardware.

The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.

bamboozled
23 replies
2h24m

As soon as the privacy thing goes away, I'd say a major part of their customer base goes away too. Most people avoid Android so they don't get "hacked"; if Apple is doing the hacking anyway, I'd just buy a cheaper alternative.

Draiken
14 replies
2h16m

At least here in Brazil, I've never heard such arguments.

Seems even more unlikely for non technical users.

It's just their latest market campaign, as far as I can tell. The vast majority of people buy iPhones because of the status it gives.

elzbardico
5 replies
2h2m

This is a prejudiced take. Running AI tasks locally on the device definitely is a giant improvement for the user experience.

But not only that, Apple CPUs are objectively leagues ahead of their competition in the mobile space. I am still using an iPhone released in 2020 with absolutely no appreciable slowdown or losses in perceived performance, because even a 4-year-old iPhone still has specs that don't lag much behind the equivalent Android phones, I still receive the latest OS updates, and because, frankly, Android OS is a mess.

If I cared about status, I would have changed my phone already for a new one.

kernal
2 replies
1h46m

Apple CPUs are objectively leagues ahead of their competition in the mobile space

This is a lie. The latest Android SoCs are just as powerful as the A series.

Because even a 4 years old IPhone still has specs that don't lag behind by much the equivalent Android phones, I still receive the latest OS updates, and because frankly, Android OS is mess.

Samsung and Google offer 7 years of OS and security updates. I believe that beats the Apple policy.

martimarkov
0 replies
1h24m

Strangely, Android 14 seems to not be available for the S20 phone, which was released in 2020?

Or am I mistaken here?

Jtsummers
0 replies
41m

Samsung and Google offer 7 years of OS and security updates. I believe that beats the Apple policy.

On the second part:

https://en.wikipedia.org/wiki/IPadOS_version_history

The last iPads to stop getting OS updates (including security, to be consistent with what Samsung and Google are pledging) got 7 and 9 years of updates each (5th gen iPad and 1st gen iPad Pro). The last iPhones to lose support got about 7 years each (iPhone 8 and X). 6S, SE (1st), and 7 got 9 and 8 years of OS support with security updates. The 5S (released in 2013) last got a security update in early 2023, so also about 9 years, the 6 (2014) ended at the same time so let's call it 8 years. The 4S, 2011, got 8 years of OS support. 5 and 5C got 7 and 6 years of support (5C was 5 in a new case, so was always going to get a year less in support).

Apple has not, that I've seen at least, ever established a long term support policy on iPhones and iPads, but the numbers show they're doing at least as well as what Samsung and Google are promising to do, but have not yet done. And they've been doing this for more than a decade now.

EDIT:

Reworked the iOS numbers a bit, down to the month (I was looking at years above and rounding, so this is more accurate). iOS support time by device for devices that cannot use the current iOS 17 (so the XS and above are not counted here) in months:

  1st - 32
  3G  - 37
  3GS - 56
  4   - 48
  4S  - 93
  5   - 81
  5C  - 69
  5S  - 112
  6   - 100
  6S  - 102
  SE  - 96
  7   - 90
  8   - 78
  X   - 76
The average is about 76 months, a bit under 6.5 years. If we knock out the first 2 phones (both have somewhat justifiable short support periods, massive hardware changes between each and their successor) the average jumps to roughly 83 months, or just shy of 7 years.
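
A quick snippet to check those averages against the table above (my own sanity check, nothing more):

  months = [32, 37, 56, 48, 93, 81, 69, 112, 100, 102, 96, 90, 78, 76]
  print(round(sum(months) / len(months), 1))          # ~76.4 months overall
  print(round(sum(months[2:]) / len(months[2:]), 1))  # ~83.4 months without the first two phones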

The 8 and X look like regressions, but their last updates were just 2 months ago (March 21, 2024) so still a good chance their support period will increase and exceed the 7 year mark like every model since the 5S. We'll have to see if they get any more updates in November 2024 or later to see if they can hit the 7 year mark.

patall
1 replies
1h44m

I am still using a IPhone released in 2020 with absolutely no appreciable slow down or losses in perceived performance.

My Pixel 4a here is also going strong, only the battery is slowly getting worse. I mean, it's 2024, do phones really still get slow? The 4a is now past android updates, but that was promised after 3 years. But at 350 bucks, it was like 40% less than the cheapest iPhone mini at that time.

onemoresoop
0 replies
1h8m

I mean, it's 2024, do phones really still get slow?

Hardware is pretty beefed up but bloat keeps on growing, that is slowing things down considerably.

dijit
4 replies
2h0m

I never understood this argument.

There's no “status” to a brand of phone when the cheapest point of entry is comparable and the flagship is cheaper than the alternative flagship.

Marketing in most of Europe is largely not the same as in the US though, so maybe it's a perspective thing.

I just find it hard to really argue “status” when the last 4 iPhone generations are largely the same and cheaper than the Samsung flagships.

At Elgiganten a Samsung S24 Ultra is 19,490 SEK[0].

The most expensive iPhone 15 pro max is 18,784 SEK at the same store[1].

[0]: https://nya.elgiganten.se/product/mobiler-tablets-smartklock...

[1]: https://nya.elgiganten.se/product/mobiler-tablets-smartklock...

Draiken
1 replies
1h17m

My take is that it's like a fashion accessory. People buy Gucci for the brand, not the material or comfort.

Rich people ask for the latest most expensive iPhone even if they're only going to use WhatsApp and Instagram on it. It's not because of privacy or functionality, it's simply to show off to everyone they can purchase it. Also to not stand out within their peers as the only one without it.

As another comment said: it's not an argument, it's a fact here.

Aerbil313
0 replies
42m

I have an iPhone so I guess I qualify as a rich person by your definition. I am also a software engineer. I cannot state enough how bogus that statement is. I've used both iPhone and Android, and recent flagships. iPhone is by far the easiest one to use. Speaking in more objective terms, iPhones have a coherent UI which maintains its consistency both throughout the OS and over the years. They're the most dumbed down phones and easiest to understand. I recommend iPhone to all my friends and relatives.

There's obviously tons of people who see iPhone as a status item. They're right, because iPhone is expensive and only the rich can buy them. This doesn't mean iPhone is not the best option out there for a person who doesn't want to extensively customize his phone and just use it.

pompino
0 replies
1h31m

Its not an argument, just ask why people lust after the latest iPhones in poor countries. They do it because they see rich people owning them. Unless you experience that, you won't really understand it.

hindsightbias
0 replies
31m

It's fashion and the kids are hip. But there is an endless void of Apple haters here who want to see it burn. They have nothing in common with 99.9% of the customer base.

briandear
1 replies
1h53m

The vast majority of people don’t. They buy because the ecosystem works. Not sure how I get status from a phone that nobody knows I have. I don’t wear it on a chain.

Draiken
0 replies
1h13m

Could it possibly be different in Brazil?

iPhones are not ubiquitous here, and they're way more expensive than other options.

everly
0 replies
2h9m

They famously had a standoff with the US gov't over the Secure Enclave.

Marketing aside, all indications point to the iOS platform being the most secure mobile option (imo).

jamesmontalvo3
4 replies
2h15m

Maybe true for a lot of the HN population, but my teenagers are mortified by the idea of me giving them android phones because then they would be the pariahs turning group messages from blue to green.

WheatMillington
1 replies
1h54m

This is a sad state of affairs.

adamomada
0 replies
1h36m

Interesting that some people would take that as an Apple problem and others would take it as a Google problem

Who’s at fault for not having built-in messaging that works with rich text, photos, videos, etc?

Google has abandoned more messaging products than I can remember while Apple focused on literally the main function of a phone in the 21st century. And they get shit for it

simonh
0 replies
1h2m

I’m in Europe and everyone uses WhatsApp, and while Android does have a higher share over here, iPhones still dominate the younger demographics. I’m not denying blue/green is a factor in the US, but it’s not even a thing here. It’s nowhere near the only or even a dominant reason iPhones are successful with young people.

adamc
0 replies
22m

Snobbery is an expensive pastime.

littlestymaar
2 replies
2h12m

Apple only pivoted into the “privacy” branding relatively recently [1] and I don't think that many people came for that reason alone. In any case, most are now trapped into the walled garden and the effort to escape is likely big enough. And there's no escape anyway, since Google will always make Android worse in that regard…

[1] In 2013 they even marketed their “iBeacon” technology as a way for retail stores to monitor and track their customers, which…

adamomada
1 replies
1h29m

Circa 2013 was the release of the Nexus 5, arguably the first really usable Android smartphone.

Privacy wasn't really a concern because most people didn't have the privacy-eroding device yet. The years following the Nexus 5 are when smartphones went into geometric growth and the slow realization of the privacy nightmare became apparent.

Imho I was really excited to get a Nexus 4 at the time; just a few short years later the shine wore off and I was horrified at the smartphone-enabled future. And I have a 40-year background in computers and understand them better than 99 out of 100 users – if I didn't see it, I can't blame them either.

mkl
0 replies
1h20m

Ca 2013 was the release of the Nexus 5, arguably the first really usable android smartphone.

What a strange statement. I was late to the game with a Nexus S in 2010, and it was really usable.

serial_dev
19 replies
2h19m

It doesn't need to stay true forever.

The alternative is Google / Android devices and OpenAI wrapper apps, both of which usually offer a half baked UI, poor privacy practices, and a completely broken UX when the internet connection isn't perfect.

Pair this with the completely subpar Android apps, Google dropping support for an app about once a month, and suddenly I'm okay with the lesser of two evils.

I know they aren't running a charity, I even hypothesized that Apple just can't build good services so they pivoted to focusing on this fake "privacy" angle. In the end, iPhones are likely going to be better for edge AI than whatever is out there, so I'm looking forward to this.

jocaal
7 replies
2h6m

better for edge AI than whatever is out there, so I'm looking forward to this

What exactly are you expecting? The current hype for AI is large language models. The word 'large' has a certain meaning in that context: much larger than can fit on your phone. Everyone is going crazy about edge AI, so what am I missing?

jitl
3 replies
1h57m

Quantized LLMs can run on a phone, like Gemini Nano or OpenLLAMA 3B. If a small local model can handle simple stuff and delegate to a model in the data center for harder tasks and with better connectivity you could get an even better experience.
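
A rough sketch of that split, with placeholder functions standing in for the on-device and hosted models (nothing here is a real API, just the shape of the routing):

  def run_local(prompt: str) -> tuple[str, float]:
      # Pretend on-device model: returns (answer, confidence in [0, 1]).
      return "local answer to: " + prompt, 0.4 if len(prompt) > 60 else 0.9

  def run_cloud(prompt: str) -> str:
      # Pretend call to a bigger model in the data center.
      return "cloud answer to: " + prompt

  def answer(prompt: str, threshold: float = 0.7) -> str:
      text, confidence = run_local(prompt)
      if confidence >= threshold:
          return text           # stays on device: fast, private, works offline
      return run_cloud(prompt)  # harder task: delegate when connectivity allows

  print(answer("set a timer for 10 minutes"))
  print(answer("summarise this 40-page contract and list every unusual clause it contains"))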

SmellTheGlove
2 replies
1h53m

If a small local model can handle simple stuff and delegate to a model in the data center for harder tasks and with better connectivity you could get an even better experience.

Distributed mixture of experts sounds like an idea. Is anyone doing that?

cheschire
1 replies
1h4m

Sounds like an attack vector waiting to happen if you deploy enough competing expert devices into a crowd.

I’m imagining a lot of these LLM products on phones will be used for live translation. Imagine a large crowd event of folks utilizing live AI translation services being told completely false translations because an actor deployed a 51% attack.

jagger27
0 replies
3m

I’m not particularly scared of a 51% attack between the devices attached to my Apple ID. If my iPhone splits inference work with my idle MacBook, Apple TV, and iPad, what’s the problem there?

jchanimal
0 replies
2h0m

It fits on your phone, and your phone can offload battery burning tasks to nearby edge servers. Seems like the path consumer-facing AI will take.

gopher_space
0 replies
54m

Everyone is going crazy about edge AI, what am I missing?

If you clone a model and then bake in a more expensive model's correct/appropriate responses to your queries, you now have the functionality of the expensive model in your clone. For your specific use case.

The size of the resulting case-specific models are small enough to run on all kinds of hardware, so everyone's seeing how much work can be done on their laptop right now. One incentive for doing so is that your approaches to problems are constrained by the cost and security of the Q&A roundtrip.
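
A toy sketch of that "bake in the expensive model's answers" step, just to show the shape of the data the small clone gets fine-tuned on (ask_teacher is a placeholder, not a real API):

  import json

  def ask_teacher(prompt: str) -> str:
      # Stand-in for a call to the expensive hosted model.
      return "teacher answer for: " + prompt

  # Your actual use-case prompts go here.
  prompts = ["classify this support ticket ...", "extract the totals from this invoice ..."]

  # Collect (prompt, completion) pairs into JSONL; this file becomes the
  # fine-tuning set for the small case-specific model.
  with open("distill.jsonl", "w") as f:
      for p in prompts:
          f.write(json.dumps({"prompt": p, "completion": ask_teacher(p)}) + "\n")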

callalex
0 replies
1h51m

In the hardware world, last year’s large has a way of becoming next year’s small. For a particularly funny example of this, check out the various letter soup names that people keep applying to screen resolutions. https://en.m.wikipedia.org/wiki/Display_resolution_standards...

rfoo
6 replies
1h52m

The alternative is Google / Android devices

No, the alternative is Android devices with everything except firmware built from source and signed by myself. And at the same time, being secure, too.

You just can't have this on Apple devices. On the Android side choices are limited too; I don't like Google and especially their disastrous hardware design, but their Pixel line is the most approachable one able to do all this.

Heck, you can't even build your own app for your own iPhone without buying additional hardware (a Mac; this is not a software issue but a legal one, since the iOS SDK is licensed to you on the condition of being used on Apple hardware only) and a yearly subscription. How is this acceptable at all?

simfree
1 replies
1h45m

WebGPU and many other features on iOS are unimplemented or implemented in half-assed or downright broken ways.

These features work on all the modern desktop browsers and on Android tho!

Aloisius
0 replies
40m

WebGPU and many other features

WebGPU isn't standardized yet. Hell, most of the features people complain about aren't part of any standard, but for some reason there's this sense that if it's in Chrome, it's standard - as if Google dictates standards.

ToucanLoucan
1 replies
1h9m

Heck, you can't even build your own app for your own iPhone without buying another hardware (a Mac, this is not a software issue, this is a legal issue, iOS SDK is licensed to you on the condition of using on Apple hardware only) and a yearly subscription. How is this acceptable at all?

Because they set the terms of use of the SDK? You're not required to use it. You aren't required to develop for iOS. Just because Google gives it all away for free doesn't mean Apple has to.

ClumsyPilot
0 replies
4m

You aren't required to develop for iOS

Do you have a legal right to write software or run your own software for hardware you bought?

Because it’s very easy to take away a right by erecting aritificial barriers, just like how you could discriminate by race at work, but pretend you are doing something else,

nrb
0 replies
55m

How is this acceptable at all?

Because as you described, the only alternatives that exist are terrible experiences for basically everyone, so people are happy to pay to license a solution that solves their problems with minimal fuss.

Any number of people could respond to “use Android devices with everything except firmware built from source and signed by myself” with the same question.

adamomada
0 replies
1h45m

The yearly subscription is for publishing your app on Apple’s store and definitely helps keep some garbage out. Running your own app on your own device is basically solved with free third-party solutions now (see AltStore, and since then a newer method I can’t recall atm).

kernal
2 replies
1h58m

subpar Android apps

Care to cite these subpar Android apps? The app store is filled to the brim with subpar and garbage apps.

Google dropping support for an app about once a month

I mean if you're going to lie why not go bigger

I'm okay with the lesser of two evils.

So the more evil company is the one that pulled out of China because they refused to hand over their users data to the Chinese government on a fiber optic silver plate?

martimarkov
1 replies
1h32m

Google operates in China albeit via their HK domain.

They also had project DragonFly if you remember.

The lesser of two evils is the one company that doesn’t try to actively profile me (in order for their ads business to be better) with every piece of data it can find and doesn’t force me to share all possible data with them.

Google is famously known to kill apps that are good and used by customers: https://killedbygoogle.com/

As for the subpar apps: there is a massive difference in network traffic on the Home Screen between iOS and Android.

kernal
0 replies
11m

Google operates in China albeit via their HK domain.

The Chinese government has access to the iCloud account of every Chinese Apple user.

They also had project DragonFly if you remember.

Which never materialized.

The lesser of two evils is that one company doesn’t try to actively profile me (in order for their ads business to be better) with every piece of data it can find and forces me to share all possible data with them.

Apple does targeted and non targeted advertising as well. Additionally, your carrier has likely sold all of the data they have on you. Apple was also sued for selling user data to ad networks. Odd for a Privacy First company to engage in things like that.

Google is famously known to kill apps that are good and used by customers: https://killedbygoogle.com/

Google has been around for 26 years I believe. According to that link 60 apps were killed in that timeframe. According to your statement that Google kills an app a month that would leave you 252 apps short. Furthermore, the numbers would indicate that Google has killed 2.3 apps per year or .192 apps per month.

As for the subpar apps: there is a massive difference between the network traffic when on the Home Screen between iOS and Android.

Not sure how that has anything to do with app quality, but if network traffic is your concern there's probably a lot more an Android user can do than an iOS user to control or eliminate the traffic.

cbsmith
0 replies
1h30m

Google has also been working on (and provides kits for) local machine learning on mobile devices... and they run on both iOS and Android. The Gemini App does send data in to Google for learning, but even that you can opt out of.

Apple's definitely pulling a "Heinz" move with privacy, and it is true that they're doing a better job of it overall, but Google's not completely horrible either.

nox101
4 replies
2h0m

Their primary business is transitioning to selling services and extracting fees. It's their primary source of growth.

brookst
2 replies
1h53m

Hey, I'm way ahead of Apple. I sell my services to my employer and extract fees from them. Do you extract fees too?

nox101
1 replies
1h50m

I'm not sure what your point is. My point (which I failed at making) is that Apple's incentives are changing because their growth is dependent on services and extracting fees, so they will likely do things that try to make people dependent on those services and find more ways to charge fees (to users and developers).

Providing services is arguably at odds with privacy since a service with access to all the data can provide a better service than one without so there will be a tension between trying to provide the best services, fueling their growth, and privacy.

brookst
0 replies
1h41m

I apologize for being oblique and kind of snarky.

My point was that it's interesting how we can frame a service business "extracting fees" to imply wrongdoing. When it's pretty normal for all services to charge ongoing fees for ongoing delivery.

adamomada
0 replies
1h41m

So the new iPad & M4 was just some weekend project that they shrugged and decided to toss over to their physical retail store locations to see if anyone still bought physical goods eh

klabb3
1 replies
52m

but it is about selling luxury hardware.

Somewhat true, but things are changing. While there are plenty of “luxury” Apple devices like the Vision Pro or fully decked out MacBooks for web browsing, we no longer live in a world where tech products are just lifestyle gadgets. People spend hours a day on their phones, and often run their lives and businesses through them. Even with the $1000+/2-3y price tag, it’s simply not that much given how central a role it serves in your life. This is especially true for younger generations who often don't have laptops or desktops at home, and also increasingly in poorer-but-not-poor countries (say e.g. Eastern Europe). So the iPhone (their best selling product) is far, far, far more a commodity utility than typical luxury consumption like watches, purses, sports cars, etc.

Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury. Especially since the M1 launched, where performance and battery life took a giant leap.

_the_inflator
0 replies
12m

I disagree.

Apple is selling hardware, and scaling AI by utilizing it is simply a smart move.

Instead of building huge GPU clusters and having to deal with NVIDIA for GPUs (Apple kicked NVIDIA out years ago because of disagreements), Apple is building mainly on existing hardware.

This is, in other terms, utilizing CPU power.

On the other hand, this helps their marketing keep high price points, since Apple can now differentiate CPU power, and therefore hardware prices, by AI functionality correlating with CPU power. This is also consistent with Apple stopping the MHz comparisons years ago.

stouset
0 replies
52m

Nothing is true forever. Google wasn’t evil forever, Apple won’t value privacy forever.

Until we figure out how to have guarantees of forever, the best we can realistically do is evaluate companies and their products by their behavior now weighted by their behavior in the past.

moritzwarhier
0 replies
11m

> In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).

Their primary business goal is to sell hardware.

There is no contradiction here. No need for luxury. Efficient hardware scales, Moore's law has just been rewritten, not defeated.

Power efficiency combined with shared and extremely fast RAM is still a formula for success, as long as they are able to deliver.

By the way, M-series MacBooks have crossed into bargain territory by now compared to WinTel in some specific (but large) niches, e.g. the M2 Air.

They are still technically superior in power efficiency and still competitive in performance in many common uses, be it traditional media decoding and processing, GPU-heavy tasks (including AI), or single-core performance...

By the way, this includes web technologies / JS.

kortilla
0 replies
32m

The MacBook Air is not a luxury device. That meme is out of date

hehdhdjehehegwv
0 replies
43m

As a privacy professional for many, many years this is 100% correct. Apple wouldn’t be taking billions from Google for driving users to their ad tracking system, they wouldn’t give the CCP access to all Chinese user data (and maybe beyond), and they wouldn’t be on-again-off-again flirting with tailored ads in Apple News if privacy was a “human right”.

(FWIW my opinion is it is a human right, I just think Tim Cook is full of shit.)

What Apple calls privacy more often than not is just putting lipstick on the pig that is their anticompetitive walled garden.

Pretty much everybody in SV who works in privacy rolls their eyes at Apple. They talk a big game but they are as full of shit as Meta and Google - and there’s receipts to prove it thanks to this DoJ case.

Apple want to sell high end hardware. On-device computation is a better user experience, hands down.

That said, Siri is utter dogshit so on-device dogshit is just faster dogshit.

joelthelion
24 replies
3h30m

In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning)

I'm curious: is anyone seriously using Apple hardware to train AI models at the moment? Obviously not the big players, but I imagine it might be a viable option for AI engineers in smaller, less ambitious companies.

andrewmcwatters
9 replies
3h22m

Yes, it can be more cost effective for smaller businesses to do all their work on Mac Studios, versus having a dedicated Nvidia rig plus Apple or Linux hardware for your workstation.

Honestly, you can train basic models just fine on M-Series Max MacBook Pros.

nightski
7 replies
3h15m

A decked out Mac Studio is like $7k for far less GPU power. I find that highly unlikely.

TylerE
2 replies
3h3m

A non-decked out Mac Studio is a hell of a machine for $1999.

Do you also compare cars by looking at only the super expensive limited editions, with every single option box ticked?

I'd also point out that said 3 year old $1999 Mac Studio that I'm typing this on already runs ML models usefully, maybe 40-50% of the old 3000-series Nvidia machine it replaces, while using literally less than 10% of the power and making a tiny tiny fraction of the noise.

Oh, and it was cheaper. And not running Windows.

bee_rider
1 replies
2h47m

They are talking about training models, though. Run is a bit ambiguous, is that also what you mean?

TylerE
0 replies
2h11m

No.

For training, the Macs do have some interesting advantages due to the unified memory. The GPU cores have access to all of system RAM (and also the system RAM is ridiculously fast - 400GB/sec when DDR4 is barely 30GB/sec, which has a lot of little fringe benefits of its own, part of why the Studio feels like an even more powerful machine than it actually is. It's just super snappy and responsive, even under heavy load.)

The largest consumer Nvidia card has 24GB of usable RAM.

The $1999 Mac has 32GB, and for $400 more you get 64GB.

$3200 gets you 96GB, and more GPU cores. You can hit the system max of 192GB for $5500 on an Ultra, albeit with the lesser GPU.

Even the recently announced 6000-series AI-oriented NVidia cards max out at 48GB.

My understanding is a that a lot of enthusiasts are using Macs for training because for certain things having more RAM is just enabling.

softfalcon
1 replies
3h6m

Don't attack me, I'm not disagreeing with you that an nVidia GPU is far superior at that price point.

I simply want to point out that these folks don't really care about that. They want a Mac for more reasons than "performance per watt/dollar" and if it's "good enough", they'll pay that Apple tax.

Yes, yes, I know, it's frustrating and they could get better Linux + GPU goodness with an nVidia PC running Ubuntu/Arch/Debian, but macOS is painless for the average science AI/ML training person to set up and work with. There are also known enterprise OS management solutions that business folks will happily sign off on.

Also, $7000 is chump change in the land of "can I get this AI/ML dev to just get to work on my GPT model I'm using to convince some VC's to give me $25-500 million?"

tldr; they're gonna buy a Mac cause it's a Mac and they want a Mac and their business uses Mac's. No amount of "but my nVidia GPU = better" is ever going to convince them otherwise as long as there is a "sort of" reasonable price point inside Apple's ecosystem.

brookst
0 replies
1h45m

What Linux setup do you recommend for 128GB of GPU memory?

inciampati
0 replies
3h9m

But you get access to a very large amount of RAM for that price.

andrewmcwatters
0 replies
2h36m

Not all of us who own small businesses are out here speccing AMD Ryzen 9s and RTX 4090s for workstations.

You can't lug around a desktop workstation.

skohan
0 replies
1h42m

a dedicated Nvidia rig

I am honestly shocked Nvidia has been allowed to maintain their moat with CUDA. It seems like AMD would have a ton to gain just spending a couple million a year to implement all the relevant ML libraries with a non-CUDA back-end.

alfalfasprout
5 replies
3h3m

Not really (I work on AI/ML Infrastructure at a well known tech company and talk regularly w/ our peer companies).

That said, inference on Apple products is a different story. There's definitely interest in inference on the edge. So far though, nearly everyone is still opting for inference in the cloud for a few reasons:

1. There's a lot of extra work involved in getting ML/AI models ready for mobile inference. And this work is different for iOS vs. Android.

2. You're limited on which exact device models will run the thing optimally. Most of your customers won't necessarily have that. So you need some kind of fallback.

3. You're limited on what kind of models you can actually run. You have way more flexibility running inference in the cloud.

teaearlgraycold
2 replies
2h43m

PyTorch actually has surprisingly good support for Apple Silicon through its MPS backend. Occasionally an operation needs to fall back to the CPU, but many applications are able to run inference entirely off the CPU cores, on the GPU.
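
For what it's worth, the basic pattern is short (a minimal sketch; the fallback environment variable has to be set before torch is imported):

  import os
  os.environ.setdefault("PYTORCH_ENABLE_MPS_FALLBACK", "1")  # CPU fallback for ops MPS lacks

  import torch

  device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")
  model = torch.nn.Linear(128, 10).to(device)  # stand-in for a real model
  x = torch.randn(32, 128, device=device)
  with torch.no_grad():
      y = model(x)
  print(y.device)  # mps:0 on an Apple Silicon build of PyTorch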

rcarmo
0 replies
2h22m

And there is a lot of work being done with mlx.

ein0p
0 replies
2h21m

I’ve found it to be pretty terrible compared to CUDA, especially with Huggingface transformers. There’s no technical reason why it has to be terrible there though. Apple should fix that.

throwitaway222
0 replies
1h41m

Inference on the edge is a lot like JS - just drop a crap ton of data to the front end, and let it render.

gopher_space
0 replies
17m

A cloud solution I looked at a few years ago could be replicated (poorly) in your browser today. In my mind the question has become one of determining when my model is useful enough to detach from the cloud, not whether that should happen.

dylan604
1 replies
2h36m

Does one need to train an AI model on specific hardware, or can a model be trained in one place and then used somewhere else? Seems like Apple could just run their fine tuned model called Siri on each device. Seems to me like asking for training on Apple devices is missing the strategy. Unless of course, it's just for purely scientific $reasons like "why install Doom on the toaster?" vs doing it for a purpose.

xanderlewis
0 replies
2h14m

It doesn’t require specific hardware; you can train a neural net with pencil and paper if you have enough time. Of course, some pieces of hardware are more efficient than others for this.

deanishe
1 replies
2h43m

Isn't Apple hardware too expensive to make that worthwhile?

brookst
0 replies
1h46m

For business-scale model work, sure.

But you can get an M2 Ultra with 192GB of UMA for $6k or so. It's very hard to get that much GPU memory at all, let alone at that price. Of course the GPU processing power is anemic compared to a DGX Station 100 cluster, but the mac is $143,000 less.

robbomacrae
0 replies
2h23m

I don't think this is what you meant but it matches the spec: federated learning is being used by Apple to train models for various applications and some of that happens on device (iphones/ipads) with your personal data before its hashed and sent up to the mothership model anonymously.

https://www.technologyreview.com/2019/12/11/131629/apple-ai-...
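
For the curious, the core of the federated-averaging idea fits in a few lines. This is a toy illustration with simulated linear-regression "devices", not Apple's actual pipeline:

  import numpy as np

  rng = np.random.default_rng(0)
  true_w = np.array([2.0, -1.0])

  def local_update(w, n=64, lr=0.1, steps=10):
      # Simulated on-device data; the raw data never leaves this function.
      X = rng.normal(size=(n, 2))
      y = X @ true_w + rng.normal(scale=0.1, size=n)
      for _ in range(steps):
          grad = 2 * X.T @ (X @ w - y) / n
          w = w - lr * grad
      return w

  w_global = np.zeros(2)
  for _ in range(5):  # 5 rounds of federated averaging
      updates = [local_update(w_global.copy()) for _ in range(10)]  # 10 "devices"
      w_global = np.mean(updates, axis=0)  # the server only ever sees weights
  print(w_global)  # approaches [2, -1]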

cafed00d
0 replies
2h41m

I like to think back to 2011 and paraphrase what people were saying: "Is anyone seriously using GPU hardware to write NL translation software at the moment?"

"No, we should use cheap, commodity, abundantly available CPUs and orchestrate them behind cloud magic to write our NL translation apps."

Or maybe: "No, we should build purpose-built high-performance computing hardware to write our NL translation apps."

Or perhaps in the early 70s "is anyone seriously considering personal computer hardware to ...". "no, we should just buy IBM mainframes ..."

I don't know. I'm probably super biased. I like the idea of all this training work breaking the shackles of cloud/mainframe/servers/off-end-user-device and migrating to run on people's devices. It feels "democratic".

avianlyric
0 replies
2h19m

Apple are. Their “Personal Voice” feature fine tunes a voice model on device using recordings of your own voice.

An older example is the “Hey Siri” model, which is fine tuned to your specific voice.

But with regards to on device training, I don’t think anyone is seriously looking at training a model from scratch on device, that doesn’t make much sense. But taking models and fine tuning them to specific users makes a whole ton of sense, and an obvious approach to producing “personal” AI assistants.

[1] https://support.apple.com/en-us/104993

Q6T46nT668w6i3m
0 replies
3h10m

Yes, there’re a handful of apps that use the neural engine to fine tune models to their data.

dheera
11 replies
3h14m

This is completely coherent with their privacy-first strategy

Apple has never been privacy-first in practice. They give you the illusion of privacy but in reality it's a closed-source system and you are forced to trust Apple with your data.

They also make it a LOT harder than Android to execute your own MITM proxies to inspect what exact data is being sent about you by all of your apps including the OS itself.

deadmutex
5 replies
3h2m

Yeah, given that they resisted putting RCS in iMessage so long, I am a bit skeptical about the whole privacy narrative. Especially when Apple's profit is at odds with user privacy.

notaustinpowers
2 replies
2h47m

From my understanding, the reason RCS was delayed is because Google's RCS was E2EE only in certain cases (both users using RCS). But also because Google's RCS runs through Google servers.

If Apple enabled RCS in messages back then, but the recipient was not using RCS, then Google now has the decrypted text message, even when RCS advertises itself as E2EE. With iMessage, at least I know all of my messages are E2EE when I see a blue bubble.

Even now, RCS is available on Android if using Google Messages. Yes, it's pre-installed on all phones, but OEMs aren't required to use it as the default. It opens up more privacy concerns because now I don't know if my messages are secure. At least with the green bubbles, I can assume that anything I send is not encrypted. With RCS, I can't be certain unless I verify the messaging app the recipient is using and hope they don't replace it with something else that doesn't support RCS.

vel0city
1 replies
1h34m

You know what would really help Apple customers increase their privacy when communicating with non-Apple devices?

Having iMessage available to everyone regardless of their mobile OS.

notaustinpowers
0 replies
1h32m

Agreed. While I have concerns regarding RCS, Apple's refusal to make iMessage an open platform due to customer lock-in is ridiculous and anti-competitive.

fabrice_d
0 replies
2h41m

How is RCS a win on the privacy front? It's not even e2e encrypted in an interoperable way (Google implementation is proprietary).

acdha
0 replies
2h13m

RCS is a net loss for privacy: it gives the carriers visibility into your social graph and doesn’t support end to end encryption. Google’s PR campaign tried to give the impression that RCS supports E2EE but it’s restricted to their proprietary client.

ben_w
3 replies
2h55m

You say that like open source isn't also an illusion of trust.

The reality is, there's too much to verify, and not enough interest for the "many eyeballs make all bugs shallow" argument.

We are, all of us, forced to trust, forced to go without the genuine capacity to verify. It's not great, and the best we can do is look for incentives and try to keep those aligned.

dheera
1 replies
2h48m

I don't agree with relying on the many eyeballs argument for security, but from a privacy standpoint, I do think at least the availability of source to MY eyeballs, as well as the ability to modify, recompile, and deploy it, is better than "trust me bro I'm your uncle Steve Jobs and I know more about you than you but I'm a good guy".

If you want to, for example, compile a GPS-free version of Android that appears like it has GPS but in reality just sends fake coordinates to keep apps happy thinking they got actual permissions, it's fairly straightforward to make this edit, and you own the hardware so it's within your rights to do this.

Open-source is only part of it; in terms of privacy, being able to see everything that is being sent in and out of my device is arguably more important than open source. Closed source would be fine if they allowed me to easily inject my own root certificate for this purpose. If they aren't willing to do that, including a 1-click replacement of the certificates in various third-party, certificate-pinning apps that are themselves potential privacy risks, it's a fairly easy modification to any open source system.

A screen on my wall that flashes every JSON that gets sent out of hardware that I own should be my right.
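
For what it's worth, the closest thing to that wall screen today is routing the device through something like mitmproxy, assuming you can get its root certificate installed and get past pinned apps, which is exactly the friction being complained about. A minimal addon sketch; the filename is arbitrary:

    # Run with: mitmdump -s log_json.py
    # Prints every outgoing request body that declares a JSON content type.
    from mitmproxy import http

    def request(flow: http.HTTPFlow) -> None:
        ctype = flow.request.headers.get("content-type", "")
        if "json" in ctype.lower():
            print(f"{flow.request.method} {flow.request.pretty_url}")
            print(flow.request.get_text())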

ben_w
0 replies
2h36m

Open-source is only part of it; in terms of privacy, being able to see what all is being sent in/out of my device is is arguably more important than open source.

I agree; unfortunately it feels as if this ship has not only sailed, but the metaphor would have to be expanded to involve the port as well.

Is it even possible, these days, to have a functioning experience with no surprise network requests? I've tried to limit mine via an extensive hosts file list, but that did break stuff even a decade ago, and the latest version of MacOS doesn't seem to fully respect the hosts file (weirdly it partially respects it?)

A screen on my wall that flashes every JSON that gets sent out of hardware that I own should be my right.

I remember reading a tale about someone, I think it was a court case or an audit, who wanted every IP packet to be printed out on paper. Only backed down when the volume was given in articulated lorries per hour.

I sympathise, but you're reminding me of that.

ajuc
0 replies
2h8m

Open source is like democracy. Imperfect and easy to fuck up, but still by far the best thing available.

Apple is absolutism. Even the so called "enlightened" absolutism is still bad compared to average democracy.

wan23
0 replies
1h47m

> Apple has never been privacy-first in practice
> They also make it a LOT harder than Android to execute your own MITM proxies

I would think ease of MITM and privacy are opposing concerns

legitster
9 replies
3h21m

This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).

I feel like people are being a bit naïve here. Apple's "Privacy First" strategy was a marketing spin developed in response to being dead-last in web-development/cloud computing/smart features.

Apple has had no problem changing their standards by 180 degrees and being blatantly anti-consumer whenever they have a competitive advantage to do so.

seec
2 replies
3h13m

Don't bother, the fanboys think Apple can't do anything wrong/malicious. At this point it's closer to a religion than ever.

You would be amazed at the response of some of them when I point out some shit Apple does that makes their products clearly lacking for the price; the cognitive dissonance is so strong they don't know how to react in any other way than lying or pretending it doesn't matter.

n9
0 replies
40m

Your comment is literally more subjective, dismissive, and full of FUD than any other on this thread. Check yourself.

acdha
0 replies
2h16m

If you’re annoyed about quasi-religious behavior, consider that your comment has nothing quantifiable and contributed nothing to this thread other than letting us know that you don’t like Apple products for non-specific reasons. Maybe you could try to model the better behavior you want to see?

robbomacrae
2 replies
2h18m

Having worked at Apple I can assure you it's not just spin. It's nigh on impossible to get permission to even compare your data with another service inside of Apple, and even if you do get permission, the user IDs and everything are completely different so there's no way to match up users. Honestly it's kind of ridiculous the lengths they go to, and it makes development an absolute PITA.

legitster
0 replies
1h7m

That could very well be true, but I also think it could change faster than people realize. Or that Apple has the ability to compartmentalize (kind of like how Apple can advocate for USB C adoption in some areas and fight it in others).

I'm not saying this to trash Apple - I think it's true of any corporation. If Apple starts losing revenue in 5 years because their LLM isn't good enough because they don't have enough data, they are still going to take it and have some reason justifying why theirs is privacy focused and everyone else is not.

briandear
0 replies
1h41m

As an Apple alum, I can agree with everything you’ve said.

IggleSniggle
2 replies
3h10m

Of course! The difference is that, for the time being, my incentives are aligned with theirs in regards to preserving my privacy.

The future is always fungible. Anyone can break whatever trust they've built very quickly. But, like the post you are replying to, I have no qualms about supporting companies that are currently doing things in my interest and don't have any clear strategic incentive to violate that trust.

Edit: that same incentive structure would apply to NVIDIA, afaik

jajko
1 replies
2h35m

I can't agree with your comment. Apple has every incentive to monetize your data; that's the whole value of Google and Meta. And they are already heading into the ad business, earning billions last I checked. Hardware isn't selling as much as before, and that isn't going to change for the better in the foreseeable future.

The logic is exactly the same as, e.g., Meta's claims: we will pseudoanonymize your data, so technically your specific privacy is just yours, see, nothing changed. But you are in various target groups for ads, plus we know how 'good' those anonymization efforts are when money is at play and corporations are only there to earn as much money as possible. The rest is PR.

IggleSniggle
0 replies
2h20m

Persuasive, thank you

krunck
9 replies
3h30m

Yes, that would be great. But without the ability for us to verify this, who's to say they won't use the edge resources (your computer and electricity) to process data (your data) and then send the results to their data center? It would certainly save them a lot of money.

IggleSniggle
3 replies
3h16m

When you can do all inference at the edge, you can keep it disconnected from the network if you don't trust the data handling.

I happen to think they wouldn't, simply because sending this data back to Apple in any form that they could digest it is not aligned with their current privacy-first strategies. But if they make a device that still works if it stays disconnected, the neat thing is that you can just...keep it disconnected. You don't have to trust them.

chem83
1 replies
2h54m

Except that's an unreasonable scenario for a smartphone. It doesn't prove that the minute the user goes online it won't be egressing data, willingly or not.

IggleSniggle
0 replies
2h48m

I don't disagree, although when I composed my comment I had desktop/laptop in mind, as I think genuinely useful on-device smartphone AI is a ways off yet, and who knows what company Apple will be by then.

bee_rider
0 replies
2h36m

To use a proprietary system and not trust the vendor, you have to never connect it. That’s possible of course, but it seems pretty limiting, right?

victorbjorklund
1 replies
2h14m

If you trust that Apple doesn't film you with the camera when you use the phone while sitting on the toilet, why wouldn't you trust Apple now?

It would have to be a huge conspiracy involving all of Apple's employees. And you can easily just listen to the network and see if they do it or not.

xanderlewis
0 replies
2h13m

I find it somewhat hard to believe that wouldn’t be in contravention of some law or other. Or am I wrong?

Of course we can then worry that companies are breaking the law, but you have to draw the line somewhere… and what have they to gain anyway?

robbomacrae
0 replies
2h20m

They already do this. It's called federated learning and it's a way for them to use your data to help personalize the model for you and also (to a much lesser extent) the global model for everyone, whilst still respecting your data privacy. It's not to save money, it's so they can keep your data private on device and still use ML.

https://www.technologyreview.com/2019/12/11/131629/apple-ai-...

chem83
0 replies
2h56m

+1. The idea that "it's on device, hence it's privacy-preserving" is Apple's marketing machine speaking, and that doesn't fly anymore. They have to do better to convince any security and privacy expert worth their salt that their claims and guarantees can be independently verified on behalf of iOS users.

Google did some of that on Android, which meant open-sourcing their on-device TEE implementation, publishing a paper about it, etc.

astrange
0 replies
3h26m

You seem to be describing face recognition in Photos like it's a conspiracy against you. You'd prefer the data center servers looking at your data?

croes
6 replies
3h20m

It isn't privacy if Apple knows.

They are the gatekeeper of your data for their benefit not yours.

jajko
5 replies
2h59m

Yes, in the end it's just some data representing the user's trained model. Is there a contractual agreement with users that Apple will never ever transfer a single byte of it, with huge penalties otherwise? If not, it's a pinky-promise PR line that sounds nice.

threeseed
4 replies
2h39m

Apple publicly documents their privacy and security practices.

At minimum, laws around the world prevent companies from knowingly communicating false information to consumers.

And in many countries the rules around privacy are much more stringent.

croes
3 replies
2h29m

I bet Boeing also has documentation about their security practices.

Talk is cheap and in Apple's case it's part of their PR.

threeseed
0 replies
1h34m

But what does that have to do with the price of milk in Turkmenistan?

Because Boeing's issues have nothing to do with privacy or security, and since they are not consumer-facing, they have no relevance to what we are talking about.

dudeinjapan
0 replies
2h18m

What is wrong with Boeing's security?

Too many holes.

bamboozled
0 replies
2h23m

What is wrong with Boeing's security?

SpaceManNabs
4 replies
2h19m

This comment is odd. I wouldn't say it is misleading, but it is odd because it borders on it.

Apple's AI strategy is to put inference (and longer term even learning) on edge devices

This is pretty much everyone's strategy. Model distillation is huge because of this. This goes in line with federated learning. This goes in line with model pruning too. And parameter efficient tuning and fine tuning and prompt learning etc.

This is completely coherent with their privacy-first strategy

Apple's marketing for their current approach is privacy-first. They are not privacy first. If they were privacy first, you would not be able to use app tracking data on their first party ad platform. They shut it off for everyone else but themselves. Apple's approach is walled garden first.

Processing data at the edge also makes for the best possible user experience because of the complete independence of network connectivity

as long as you don't depend on graph centric problems where keeping a local copy of that graph is prohibitive. Graph problems will become more common. Not sure if this is a problem for apple though. I am just commenting in general.

If (and that's a big if) they keep their APIs open to run any kind of AI workload on their chips

Apple does not have a good track record of this; they are quite antagonistic when it comes to this topic. Gaming on apple was dead for nearly a decade (and pretty much still is) because steve jobs did not want people gaming on macs. Apple has eased up on this, but it very much seems that if they want you to use their devices (not yours) in a certain way, then they make it expensive to do anything else.

Tbf, I don't blame apple for any of this. It is their strategy. Whether it works or not, it doesn't matter. I just found this comment really odd since it almost seemed like evangelism.

edit: weird to praise apple for on device training when it is not publicly known if they have trained any substantial model even on cloud.

nomel
1 replies
2h4m

This is pretty much everyone's strategy.

I think this is being too charitable on the state of "everyone". It's everyone's goal. Apple is actively achieving that goal, with their many year strategy of in house silicon/features.

SpaceManNabs
0 replies
22m

Apple is actively achieving that goal, with their many year strategy of in house silicon/features

So are other companies, with their many-year strategy of actually building models that are accessible to the public.

Yet Apple is "actively" achieving the goal without any distinct models.

jameshart
1 replies
1h56m

Everyone’s strategy?

The biggest players in commercial AI models at the moment - OpenAI and Google - have made absolutely no noise about pushing inference to end user devices at all. Microsoft, Adobe, other players who are going big on embedding ML models into their products, are not pushing those models to the edge, they’re investing in cloud GPU.

Where are you picking up that this is everyone’s strategy?

SpaceManNabs
0 replies
1h48m

Where are you picking up that this is everyone’s strategy?

Read what their engineers say in public. Unless I hallucinated years of federated learning.

Also apple isn't even a player yet and everyone is discussing how they are moving stuff to the edge lol. Can't critique companies for not being on the edge yet when apple doesn't have anything out there.

kernal
2 replies
2h7m

This is completely coherent with their privacy-first strategy

How can they have a privacy first strategy when they operate an Ad network and have their Chinese data centers run by state controlled companies?

n9
0 replies
42m

... I think that the more correct assertion would be that Apple is a sector leader in privacy. If only because their competitors make no bones about violating the privacy of their customers, as it is the basis of their business model. So it's not that Apple is A+ so much as the other students are getting Ds and Fs.

KerrAvon
0 replies
1h50m

How can I have mint choc and pineapple swirl ice cream when there are children starving in Africa?

xipix
1 replies
1h31m

How is local more private? Whether AI runs on my phone or in a data center I still have to trust third parties to respect my data. That leaves only latency and connectivity as possible reasons to wish for endpoint AI.

chatmasta
0 replies
1h27m

If you can run AI in airplane mode, you are not trusting any third party, at least until you reconnect to the Internet. Even if the model was malware, it wouldn’t be able to exfiltrate any data prior to reconnecting.

You’re trusting the third party at training time, to build the model. But you’re not trusting it at inference time (or at least, you don’t have to, since you can airgap inference).

thefourthchime
1 replies
1h29m

Yes, it isn't completely clear. My guess is they do something like "Siri-powered shortcuts", where you can ask it to do a couple of things and it'll dynamically create a script and execute it.

I can see a smaller model trained to do that working well enough; however, I've never seen any real working example of this. That Rabbit device is heading in that direction, but it's mostly vaporware for now.

_boffin_
0 replies
1h26m

Pretty much my thoughts too. Going to have a model that’s smaller than 3B built in. They’ll have tokens that directly represent functions / shortcuts.
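
Something like the sketch below, presumably: the model emits a structured action rather than prose, and a thin runtime dispatches it to a shortcut. The action names, the registry, and the fake model output are all invented for illustration.

    import json

    # A thin dispatcher from model output to "shortcuts".
    SHORTCUTS = {
        "set_timer": lambda minutes: print(f"Timer set for {minutes} min"),
        "send_message": lambda to, body: print(f"To {to}: {body}"),
    }

    def run_action(model_output: str) -> None:
        action = json.loads(model_output)            # produced by the local model
        SHORTCUTS[action["name"]](**action["args"])  # hand off to the shortcut

    # Pretend the on-device model produced this for "text Sam I'm running late":
    run_action('{"name": "send_message", "args": {"to": "Sam", "body": "Running late!"}}')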

strangescript
1 replies
2h4m

The fundamental problem with this strategy is model size. I want all my apps to be privacy first with local models, but there is no way they can share models in any kind of coherent way. Especially when good apps are going to fine tune their models. Every app is going to be 3GB+

tyho
0 replies
1h44m

Foundation models will be the new .so files.
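
In that spirit, a rough PyTorch sketch of what sharing could look like: one frozen, system-wide layer used by everyone, with each app shipping only a tiny low-rank (LoRA-style) delta. The dimensions and initialization are simplified and invented.

    import torch
    import torch.nn as nn

    # One frozen, system-wide layer plus a tiny per-app low-rank delta.
    class AppAdapter(nn.Module):
        def __init__(self, shared: nn.Linear, rank: int = 4):
            super().__init__()
            self.shared = shared                  # owned by the OS, frozen here
            for p in self.shared.parameters():
                p.requires_grad = False
            self.A = nn.Parameter(torch.zeros(rank, shared.in_features))
            self.B = nn.Parameter(torch.randn(shared.out_features, rank) * 0.01)

        def forward(self, x):
            return self.shared(x) + x @ self.A.t() @ self.B.t()   # base + app delta

    shared_layer = nn.Linear(512, 512)     # stand-in for the system foundation model
    app_specific = AppAdapter(shared_layer)
    trainable = sum(p.numel() for p in app_specific.parameters() if p.requires_grad)
    print(trainable)                       # ~4k adapter params vs ~263k shared ones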

s1k3s
1 replies
2h33m

For everyone else who doesn't understand what this means, he's saying Apple wants you to be able to run models on their devices, just like you've been doing on nvidia cards for a while.

nomel
0 replies
2h11m

I think he's saying they want to make local AI a first class, default, capability, which is very unlike buying a $1k peripheral to enable it. At this point (though everyone seems to be working on it), other companies need to include a gaming GPU in every laptop, and tablet now (lol), to enable this.

lunfard000
1 replies
2h41m

Probably because they are, like, super-behind in the cloud space; it's not like they wouldn't like to sell the service. They've ignored photo privacy quite a few times in iCloud.

dylan604
0 replies
2h34m

Is it surprising, since they've effectively given the finger to data center hardware designs?

LtWorf
1 replies
1h5m

This is completely coherent with their privacy-first strategy

Is this the same apple whose devices do not work at all unless you register an apple account?

yazzku
0 replies
12m

Some people really seem to be truly delusional. It's obvious that their "privacy" is a marketing gimmick when you consider the facts. Do people not consider the facts anymore? How does somebody appeal to the company's "privacy-first strategy" with a straight face in light of the facts?

w1nst0nsm1th
0 replies
25m

to put inference on edge devices...

It will take a long time before you can put performant inference on an edge device.

Just download one of the various open source large(st) language models and test it on your desktop...

Compute power, memory, and storage requirements are insane if you want decent results... I mean not just Llama gibberish.

Until such requirements are satisfied, remote models are the way to go, at least for conversational models.

Aside from LLMs, AlphaGo would not run on any end-user device, by a long shot, even though it is already an 'old' technology.

I think a 'neural engine' on an end-user device is just marketing nonsense at the current state of the art.
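
Some rough weight-only arithmetic backs up the point (the parameter counts are the commonly published ones, and this ignores the KV cache and activations, so real usage is higher):

    # Back-of-envelope memory math for local LLM inference (weights only).
    def weight_gb(params_billions: float, bits_per_weight: int) -> float:
        return params_billions * 1e9 * bits_per_weight / 8 / 1e9

    for name, params in [("8B class", 8), ("70B class", 70)]:
        for bits in (16, 4):
            print(f"{name} @ {bits}-bit: ~{weight_gb(params, bits):.0f} GB of weights")

    # 8B at 4-bit is ~4 GB: plausible on a 16GB device, tight next to apps on 8GB.
    # 70B at 16-bit is ~140 GB: out of reach for any current consumer edge device.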

tweetle_beetle
0 replies
24m

I wonder if BYOE (bring your own electricity) also plays a part in their long term vision? Data centres are expensive in terms of hardware, staffing and energy. Externalising this cost to customers saves money, but also helps to paint a green(washing) narrative. It's more meaningful to more people to say they've cut their energy consumption by x than to say they have a better server obsolescence strategy, for example.

sergiotapia
0 replies
2h59m

privacy-first strategy

That's just their way of walled gardening apple customers. Then they can extort devs and other companies dry without any middle-men.

macns
0 replies
1h47m

This is completely coherent with their privacy-first strategy

You mean... with their claimed privacy-first strategy

kshahkshah
0 replies
33m

Why is Siri still so terrible though?

jablongo
0 replies
2h35m

So for hardware-accelerated training with something like PyTorch, does anyone have a good comparison between Metal and CUDA, both in terms of performance and capabilities?
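
(For anyone who wants to try it themselves: the same PyTorch code targets either backend, so it's easy to time your own workload. A rough sketch, not a rigorous benchmark; the matrix size and loop count are arbitrary.)

    import time
    import torch

    # Pick whichever accelerator is present; the model/ops code stays identical.
    if torch.cuda.is_available():
        device = torch.device("cuda")
    elif torch.backends.mps.is_available():
        device = torch.device("mps")   # Apple GPUs via Metal Performance Shaders
    else:
        device = torch.device("cpu")

    x = torch.randn(4096, 4096, device=device)
    start = time.time()
    for _ in range(50):
        x = torch.tanh(x @ x)          # arbitrary GPU-bound work
    x = x.cpu()                        # forces the async GPU queue to finish
    print(device, round(time.time() - start, 3))

    # Capability gaps on mps surface as NotImplementedError for unsupported ops;
    # setting PYTORCH_ENABLE_MPS_FALLBACK=1 routes those ops to the CPU instead.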

gtirloni
0 replies
39m

> complete independence of network connectivity and hence minimal latency.

Does it matter that each token takes additional milliseconds on the network if the local inference isn't fast? I don't think it does.

The privacy argument makes some sense, if there's no telemetry leaking data.

elorant
0 replies
1h21m

What other options do they have? Build hardware for the datacenter? They’ve avoided this for ages. Consumer hardware is their bread and butter and that’s where they’ll focus. The problem, though, is that even their high-end Mac Studio Ultra can’t hold a candle to an RTX A6000. So they’re trying to change the narrative. Here’s a new iPad with a powerful AI chip that blows the competition out of the water. Sure, fine. Let's wait and see if there is a need for something like that.

choppaface
0 replies
1h7m

On “privacy”: if Apple owned the Search app instead of leaving search to Google, and used their own ad network (which they already have for the App Store today), Apple would absolutely use your data, location, etc. to target you with ads.

It could even be third-party services sending ad candidates directly to your phone, with the on-device AI choosing which is relevant.

Privacy is a contract, not the absence of a clear business opportunity. Just look at how Apple does testing internally today. They have no more respect for human privacy than any of their competitors. They just differentiate through marketing and design.

aiauthoritydev
0 replies
44m

I think these days everyone links their products with AI. Today even the BP CEO linked his business with AI. Edge inference and cloud inference are not mutually exclusive choices. Any serious provider will offer both, and the improvement in quality of service comes from you giving more of your data to the service provider. Most people are totally fine with that and that will not change anytime soon. Privacy paranoia is mostly a fringe thing in consumer tech.

aiauthoritydev
0 replies
48m

Edge inference and cloud inference are not mutually exclusive and chances are any serious player would be dipping their toes in both.

aborsy
0 replies
24m

What are examples of the edge devices made by Apple?

Powdering7082
0 replies
46m

Also, they don't have to pay either the capex or opex costs for training a model if they get users' devices to train the models.

MyFirstSass
0 replies
2h54m

I've been saying the same thing since the ANE and the incredible new chips with shared RAM appeared; suddenly everyone could run capable local models. But then Apple decided to be catastrophically stingy once again, putting a ridiculous 8GB of RAM in these new iPads and their new MacBook Airs, destroying the prospect of a widespread "intelligent local Siri" because now half the new generation can't run anything.

Apple is an amazing powerhouse but also disgustingly elitist and wasteful, if not straight up vulgar in its profit motives. There's really zero idealism there despite their romantic and creative legacy.

There are always some straight-up idiotic limitations in their otherwise incredible machines, with no other purpose than to create planned obsolescence, "PRO" exclusivity and piles of e-waste.

KaiserPro
0 replies
2h50m

This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).

I mean yeah, that makes good marketing copy, but it's more about reducing latency and keeping running costs down.

but as this is mostly marketing fluff we'll need to actually see how it performs before casting judgment on how "revolutionary" it is.

7speter
0 replies
47m

I personally really really welcome as I don't want the AI future to be centralised in the hands of a few powerful cloud providers.

Watch out for being able to use AI on your local machine while those AI services still use telemetry to send your data (recorded conversations, for instance) to their motherships.

EduardoBautista
123 replies
6h46m

The 256GB and 512GB models have 8GB of RAM. The 1TB and 2TB models have 16GB. Not a fan of tying RAM to storage.

https://www.apple.com/ipad-pro/specs/

KeplerBoy
37 replies
6h11m

If these things ever get macOS support, they will be useless with 8 GB of RAM.

Such a waste of nice components.

greggsy
23 replies
6h1m

This comes up frequently. 8GB is sufficient for most casual and light productivity use cases. Not everyone is a power user, in fact, most people aren’t.

regularfry
17 replies
5h54m

My dev laptop is an 8GB M1. It's fine. Mostly.

I can't run podman, slack, teams, and llama3-8B in llama.cpp at the same time. Oddly enough, this is rarely a problem.

prepend
6 replies
5h34m

It’s the “Mostly” part that sucks. What’s the price difference between 8 and 16? Like $3 in wholesale prices.

This just seems like lameness on Apple’s part.

Aurornis
3 replies
4h44m

What’s the price difference between 8 and 16? Like $3 in wholesale prices.

Your estimates are not even close. You can't honestly think that LPDDR5 at leading edge speeds is only $3 per 64 Gb (aka 8GB), right?

Your estimate is off by an order of magnitude. The memory Apple is using is closer to $40 for that increment, not $3.

And yes, they include a markup, because nobody is integrating hardware parts and selling them at cost. But if you think the fastest LPDDR5 around only costs $3 for 8GB, that's completely out of touch with reality.

moooo99
2 replies
4h8m

Even taking rising market prices into account, your estimate for the RAM module price is waaaaaaay off.

You can get 8GB of good quality DDR5 DIMMs for $40 retail; there is no way in hell that Apple is paying anywhere near that.

Going from 8 to 16GB is probably somewhere between $3-8 purely in material costs for Apple, not taking into account any other costs associated.

smarx007
1 replies
3h20m

GP said "LPDDR5" and that Apple won't sell at component prices.

You mention DIMMs and component prices instead. This is unhelpful.

See https://www.digikey.com/en/products/filter/memory/memory/774... for LPDDR5 prices. You can get a price of $48/chip at a volume of 2000 chips. Assuming that Apple got a deal of $30-40-ish at a few orders of magnitude larger order is quite fair. Though it certainly would be nicer if Apple priced 8GB increments not much above $80-120.

moooo99
0 replies
2h32m

I am aware that there are differences, I just took RAM DIMMs as a reference because there is a >0% chance that anyone reading this has actually ever bought a comparable product themselves.

As for prices, the prices you cited are not at all comparable. Apple is absolutely certainly buying directly from manufacturers without a middleman since we're talking about millions of units delivered each quarter. Based on those quantities, unit prices are guaranteed to be substantially lower than what DigiKey offers.

Based on what little public information I was able to find, spot market prices for LPDDR4 RAM seem to be somewhere in the $3 to $5 range for 16GB modules. Let's be generous and put LPDDR5 at triple the price, at $15 for a 16GB module. Given that the upgrade price for going from 8 to 16GB is 230 EUR, Apple is surely making a huge profit on those upgrades alone by selling an essentially unusable base configuration for a supposed "Pro" product.

TremendousJudge
0 replies
5h7m

They have always done this, for some reason people buy it anyway, so they have no incentive to stop doing it.

KeplerBoy
0 replies
5h29m

It's not quite like that. Apple's RAM is in the SoC package, it might be closer to 20$, but still.

oblio
3 replies
3h39m

I imagine you don't have browsers with many tabs.

robin_reala
0 replies
3h25m

The number of tabs you have doesn’t correlate to the number of active web views you have, if you use any browser that unloads background tabs while still saving their state.

adastra22
0 replies
3h26m

I could never understand how people operate with more than a dozen or so open tabs.

GOONIMMUNE
0 replies
2h34m

how many is "many"? I'm also on an M1 Mac 8 GB RAM and I have 146 chrome tabs open without any issues.

gs17
1 replies
2h59m

Mine is 8GB M1 and it is not fine. But the actual issue for me isn't RAM as much as it is disk space, I'm pretty confident if it wasn't also the 128 GB SSD model it would handle the small memory just fine.

I'm still getting at least 16 GB on my next one though.

regularfry
0 replies
2h41m

Yeah, that's definitely a thing. Podman specifically eats a lot.

internet101010
0 replies
4h13m

At this point I don't think the frustration has much to do with the performance but rather RAM is so cheap that intentionally creating a bottleneck to extract another $150 from a customer comes across as greedy, and I am inclined to agree. Maybe the shared memory makes things more expensive but the upgrade cost has always been around the same amount.

It's not quite in the same ballpark as showing apartment or airfare listings without mandatory fees but it is at the ticket booth outside of the stadium.

flawsofar
0 replies
4h59m

Local LLMs are sluggish on my M2 Air 8GB,

but up until these things I felt I could run whatever I wanted, including Baldur’s Gate 3.

camel-cdr
0 replies
4h38m

Programming has a weird way of requiring basically nothing sometimes, but other times you need to build the latest version of your toolchain, or you are working on some similarly huge project that takes ages to compile.

I was using my 4GB RAM PineBook Pro on public transport yesterday, and decided to turn off all cores except for a single Cortex-A53 to save some battery. I had no problems for my use case of a text editor + shell to compile with, for doing some SIMD programming.

Aurornis
0 replies
4h51m

Same here. My secondary laptop is 8GB of RAM and it's fine.

As devs and power users we'll always have an edge case for higher RAM usage, but the average consumer is going to be perfectly fine with 8GB of RAM.

All of these comments about how 8GB of RAM is going to make it "unusable" or a "waste of components" are absurd.

dialup_sounds
1 replies
5h34m

Most people that consider themselves "power users" aren't even power users, either. Like how being into cars doesn't make you a race car driver.

0x457
0 replies
4h22m

Race car drivers think they are pros and can't even rebuild the engine in their car.

There are different categories of "power users"

Pikamander2
1 replies
5h41m

That would be fine if the 8GB model was also priced for casual and light productivity use cases. But alas, this is Apple we're talking about.

wiseowise
0 replies
0m

MacBook Air starts at 1,199 euro. For insane battery life, amazing performance, great screen and one of the lightest chassis. Find me comparable laptop, I’ll wait.

wvenable
0 replies
4h46m

Isn't this the "Pro" model?

Aurornis
7 replies
4h52m

I have a minimum of 64GB on my all my main developer machines (home, work, laptop), but I have a spare laptop with only 8GB of RAM for lightweight travel.

Despite the entire internet telling me it would be "unusable" and a complete disaster, it's actually 100% perfectly fine. I can run IDEs, Slack, Discord, Chrome, and do dev work without a problem. I can't run a lot of VMs or compile giant projects with 10 threads, of course, but for typical work tasks it's just fine.

And for the average consumer, it would also be fine. I think it's obvious that a lot of people are out of touch with normal people's computer use cases. 8GB of RAM is fine for 95% of the population and the other 5% can buy something more expensive.

mananaysiempre
1 replies
4h31m

For me personally, it’s not an issue of being out of touch. I did, in fact, use a 2014 Macbook with an i5 CPU and 16 GB of RAM for nearly a decade and know how often I hit swap and/or OOM on it even without attempting multicore shenanigans which its processor couldn’t have managed anyway.

It’s rather an issue of selling deliberately underpowered hardware for no good reason other than to sell actually up-to-date versions for a difference in price that has no relation to the actual availability or price of the components. The sheer disconnect from any kind of reality offends me as a person whose job and alleged primary competency is to recognize reality then bend it to one’s will.

KeplerBoy
0 replies
3h58m

I don't think we were ever at a point in computing where you could buy a high-end laptop (even entry-level MacBooks have high-end pricing) with the same amount of RAM as you could 10 years earlier.

8 GB was the standard back then.

leptons
1 replies
4h6m

Be honest, that 8GB computer isn't running MacOS, is it.

adastra22
0 replies
3h29m

That’s the standard configuration of a MacBook Air.

elaus
1 replies
4h25m

But why did you configure 3 machines with 64+ GB, if 8 GB RAM are "100% perfectly fine" for typical work tasks?

For me personally 16 or 32 GB are perfectly fine, 8 GB was too little (even without VMs) and I've never needed 64 or more. So it's curious to see you are pretty much exactly the opposite.

joshmanders
0 replies
1h3m

But why did you configure 3 machines with 64+ GB, if 8 GB RAM are "100% perfectly fine" for typical work tasks?

Did you miss this part prefixing that sentence?

I can't run a lot of VMs or compile giant projects with 10 threads, of course
grecy
0 replies
3h28m

I'm editing 4k video and thousands of big RAW images.

The used M1 MacBook Air I just bought is by far the fastest computer I have ever used.

reustle
2 replies
5h43m

If these things will ever get MacOS support

The Macbook line will get iPadOS support long before they allow MacOS on this line. Full steam ahead towards the walled garden.

bluescrn
0 replies
5h28m

iOS has become such a waste of great hardware, especially in the larger form factor of the iPad.

M1 chips, great screens, precise pencil input and keyboard support, but we still aren't permitted a serious OS on it, to protect the monopolistic store.

App Stores have been around long enough to prove that they're little more than laboratories in which to carry out accelerated enshittification experimentation. Everything so dumbed down and feature-light, yet demanding subscriptions or pushing endless scammy ads. And games that are a shameless introduction to gambling addiction, targeted at kids.

Most of the 'apps' that people actually use shouldn't need to be native apps anyway, they should be websites. And now we get the further enshittification of trying to force people out of the browser and into apps, not for a better experience, but for a worse one, where more data can be harvested and ads can't so easily be blocked...

arecurrence
0 replies
4h43m

If the iPad could run Mac apps when docked to Magic Keyboard like the Mac can run iPad apps then there may be a worthwhile middle ground that mostly achieves what people want.

The multitasking will still be poor but perhaps Apple can do something about that when in docked mode.

That said, development likely remains a non-starter given the lack of unix tooling.

leptons
0 replies
4h7m

These specific model of tablets won't ever get MacOS support. Apple will tell you when you're allowed to run MacOS on a tablet, and they'll make you buy a new tablet specifically for that.

jiehong
0 replies
4h22m

People always complain that devs should write more efficient software, so maybe that's one way!

At least, Chrome surely wouldn't run that many tabs on an iPad if it used the same engine as desktop Chrome.

Laaas
34 replies
6h39m

The economic reasoning behind this doesn't make sense to me.

What do they lose by allowing slightly more freedom in configurations?

izacus
16 replies
6h31m

They push you to buy the more expensive model with higher margins.

This is what they did when I was buying an iPad Air - it starts with a genuinely problematic 64GB of storage... and the 256GB model is the next one up, with a massive price jump.

It's the same kind of "anchoring" (marketing term) that car dealers use to lure you into deciding on their car based on the cheapest $29,999 model, which with "useful" equipment will end up costing you more like $45,000.

aeyes
15 replies
6h10m

Honest question: What data do you store on an iPad Air? On a phone you might have some photos and videos but isn't a tablet just a media consumption device? Especially on iOS where they try to hide the filesystem as much as possible.

izacus
5 replies
6h7m

No data, but iOS apps have gotten massive, caches have gotten massive and install a game or two and 64GB is gone.

Not to mention that it's occasionally nice to have a set of downloaded media available for vacation/travel, and 64GB isn't enough to download a week's worth of content from Netflix.

This is why this is so annoying - you're right, I don't need 512GB or 256GB. But I'd still like to have more than "You're out of space!!" amount.

trogdor
2 replies
5h48m

Where is 64GB coming from?

izacus
1 replies
3h24m

The base iPad Air model - the one whose price is most often quoted - is 64GB.

nozzlegear
1 replies
4h47m

I've had the original iPad Pro with 64gb since it first released and have somehow never run out of storage. Maybe my problem is that I don't download games. I'd suggest using a USB drive for downloaded media though if you're planning to travel. All of the media apps I use (Netflix, YouTube, Crunchyroll, etc.) support them. That's worked well for me and is one reason I was comfortable buying the 64gb model.

sroussey
0 replies
4h8m

How do you get Netflix to use an external drive?

jdminhbg
3 replies
5h1m

isn't a tablet just a media consumption device?

This is actually most of the storage space -- videos downloaded for consumption in places with no or bad internet.

kmeisthax
0 replies
3h8m

Yes. However, applications have to be specifically written to use external storage, which requires popping open the same file picker you use to interact with non-Apple cloud storage. If they store data in their own container, then that can only ever go on the internal storage, iCloud, or device backups. You aren't allowed to rugpull an app and move its storage somewhere else.

I mean, what would happen if you yanked out the drive while an app was running on it?

izacus
0 replies
3h23m

Even on Android you can't download streaming media to OTG USB storage.

wincy
0 replies
4h7m

We use PLEX for long trips in the car for the kids. Like 24 hour drives. We drive to Florida in the winter and the iPads easily run out of space after we’ve downloaded a season or two of Adventure Time and Daniel Tiger.

I could fit more if I didn’t insist on downloading everything 1080p I guess.

maxsilver
0 replies
5h26m

What data do you store on an iPad Air?

Games. You can put maybe three or four significant games on an iPad Air before it maxes out. (MTG Arena is almost 20GB all on its own, Genshin Impact is like 40+ GB.)

lozenge
0 replies
3h21m

iPad OS is 17 GB and every app seems to think it'll be the only one installed.

lotsofpulp
0 replies
5h36m

Email, password manager, iOS keychain, photos, videos, etc should all be there if synced to iCloud.

imtringued
0 replies
5h40m

As he said, you buy excess storage so that you don't have to think about how much storage you are using. Meanwhile if you barely have enough, you're going to have to play data tetris. You can find 256GB SSDs that sell for as low as 20€. How much money is it worth to not worry about running out of space? Probably more than the cost of the SSD at these prices.

michaelt
4 replies
5h46m

Because before you know it you're Dell and you're stocking 18 different variants of "Laptop, 15 inch screen, 16GB RAM, 512GB SSD" and users are scratching their heads trying to figure out WTF the difference is between a "Latitude 3540" a "Latitude 5540" and a "New Latitude 3550"

gruez
1 replies
4h50m

I can't tell whether this is serious or not. Surely adding independently configurable memory/storage combinations won't confuse the user any more than the existing configurable storage options confuse users about which iPhone to get?

its_ethan
0 replies
36m

Configuring your iPhone storage is something every consumer has a concept of, it's some function of "how many pictures can I store on it"? When it comes to CPU/GPU/RAM and you're having to configure all three, the average person is absolutely more likely to be confused.

It's anecdotal, but 8/10 people that I know over the age of 40 would have no idea what RAM or CPU configurations even theoretically do for them. This is probably the case for most iPad purchasers, and Apple knows this - so why would they provide expensive/confusing configurability options just for the handful of tech-y people who may care? There are still high/med/low performance variations that those people can choose from, and the number of people for whom that would sour them away from a sale is vanishingly small; they would likely not even be looking at Apple in the first place.

skeaker
0 replies
4h45m

Yes, the additional $600 they make off of users who just want extra RAM is just an unfortunate side effect of the unavoidable process of not being Dell. Couldn't be any other reason.

kmeisthax
0 replies
3h12m

Apple already fixed this with the Mac: they stock a handful of configurations most likely to sell, and then everything else is a custom order shipped direct from China. The reason why Apple has to sell specific RAM/storage pairs for iPads is that they don't have a custom order program for their other devices, so everything has to be an SKU, and has to sell in enough quantity to justify being an SKU.

blegr
2 replies
6h38m

It forces you to buy multiple upgrades instead of just the one you need.

jessriedel
1 replies
6h34m

But why does this make them more money than offering separate upgrades at higher prices?

I do think there is a price discrimination story here, but there are some details to be filled in.

foldr
0 replies
6h5m

It's not obvious to me that Apple does make a significant amount of money by selling upgrades. Almost everyone buys the base model. The other models are probably little more than a logistical pain in the butt from Apple's perspective. Apple has to offer more powerful systems to be credible as a platform, but I wouldn't be surprised if the apparently exorbitant price of the upgrades reflects the overall costs associated with complicating the production and distribution lines.

vinkelhake
0 replies
6h35m

It's just Apple's price ladder. The prices on their different SKUs are laid out carefully so that there's never too big a jump to the next level.

https://talkbackcomms.com/blogs/news/ladder

naravara
0 replies
6h32m

Logistical efficiencies mostly. It ends up being a lot of additional SKUs to manage, and it would probably discourage people from moving up a price tier if they would have otherwise. So from Apple’s perspective they’re undergoing more hassle (which costs) for the benefit of selling you lower margin products. No upside for them besides maybe higher customer satisfaction, but I doubt it would have moved the needle on that very much.

jessriedel
0 replies
6h37m

I think it’s a price discrimination technique.

gehsty
0 replies
6h15m

Fewer iPad SKUs = more efficient manufacturing and logistics, at iPad scale probably means a very real cost saving.

fallat
0 replies
6h38m

The reasoning is money. Come on.

dragonwriter
0 replies
5h17m

What do they lose by allowing slightly more freedom in configurations?

More costs everywhere in the chain; limiting SKUs is a big efficiency from manufacturing to distribution to retail to support, and it is an easy way (for the same reason) to improve the customer experience, because it makes it a lot easier to not be out of or have delays for a customer’s preferred model, as well as making the UI (online) or physical presentation (brick and mortar) for options much cleaner.

Of course, it can feel worse if you are a power user with detailed knowledge of your particular needs in multiple dimensions and you feel like you are paying extra for features you don't want, but the efficiencies may make that feeling an illusion — with more freedom, you would be paying for the additional costs that created, so a higher cost for the same options and possibly just as much or more for the particular combination option you would prefer with multidimensional freedom as for the one with extra features without it. Though that counterfactual is impossible to test.

abtinf
0 replies
5h30m

The GP comment can be misleading because it suggests Apple is tying storage to ram. That is not the case (at least not directly).

The RAM and system-on-chip are tied together as part of the system-on-package. The SoP is what enables M chips to hit their incredible memory bandwidth numbers.

This is not an easy thing to allow configuration. They can’t just plug a different memory chip as a final assembly step before shipping.

They only have two SoPs as part of this launch: a 9-core CPU with 8GB, and a 10-core CPU with 16GB. The RAM is unified for CPU/GPU (and I would assume the Neural Engine too).

Each new SoP is going to reduce economies of scale and increase supply chain complexity. The 256/512gb models are tied to the first package, the 1/2tb models are tied to the second. Again, these are all part of the PCB, so production decisions have to be made way ahead of consumer orders.

Maybe it’s not perfect for each individual’s needs, but it seems reasonable to assume that those with greater storage needs also would benefit from more compute and RAM. That is, you need more storage to handle more video production so you are probably more likely to use more advanced features which make better use of increased compute and RAM.

a_vanderbilt
0 replies
6h36m

Margins and profit. Less variations in production makes for higher efficiency. Segmenting the product line can push consumers to purchase higher tiers of product. It's iOS anyways, and the people who know enough to care how much RAM they are getting are self-selecting for those higher product tiers.

Aurornis
0 replies
4h39m

Several things:

1. Having more SKUs is expensive, for everything from planning to inventory management to making sure you have enough shelf space at Best Buy (which you have to negotiate for). Chances are good that stores like Best Buy and Costco would only want 2 SKUs anyway, so the additional configs would be a special-order item for a small number of consumers.

2. After a certain point, adding more options actually decreases your sales. This is confusing to people who think they'd be more likely to buy if they could get exactly what they wanted, but what you're not seeing is the legions of casual consumers who are thinking about maybe getting an iPad, but would get overwhelmed by the number of options. They might spend days or weeks asking friends which model to get, debating about whether to spend extra on this upgrade or that, and eventually not buying it or getting an alternative. If you simplify the lineup to the "cheap one" and the "high end one" then people abandon most of that overhead and just decide what they want to pay.

The biggest thing tech people miss is that they're not the core consumers of these devices. The majority go to casual consumers who don't care about specifying every little thing. They just want to get the one that fits their budget and move on. Tech people are secondary.

paulpan
18 replies
6h12m

I don't think it's strictly for price gouging/segmentation purposes.

On the MacBooks (running macOS), RAM has been used as a data cache to speed up data read/write performance until the actual SSD storage operation completes. It makes sense for Apple to account for that with a higher RAM spec on the 1TB/2TB configurations.

pantalaimon
6 replies
6h2m

RAM has been used as data cache to speed up data read/write performance until the actual SSD storage operation completes.

I'm pretty sure that's what all modern operating systems are doing.
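
A quick way to see that in action on any modern OS (the timings are machine-dependent; the point is the gap between the buffered write and the explicit flush):

    import os
    import time

    # write() returns once the data is in the OS page cache (RAM); fsync() is
    # what actually waits for the storage device.
    payload = b"x" * (64 * 1024 * 1024)   # 64 MB of junk

    fd = os.open("cache_demo.bin", os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
    t0 = time.time()
    os.write(fd, payload)                 # lands in the page cache
    t1 = time.time()
    os.fsync(fd)                          # forces it out to the drive
    t2 = time.time()
    os.close(fd)
    os.remove("cache_demo.bin")

    print(f"write(): {t1 - t0:.4f}s   fsync(): {t2 - t1:.4f}s")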

wheybags
3 replies
4h26m

I'm writing this from memory, so some details may be wrong but: most high end ssds have dram caches on board, with a capacitor that maintains enough charge to flush the cache to flash in case of power failure. This operates below the system page cache that is standard for all disks and oses.

Apple doesn't do this, and instead uses their tight integration to perform a similar function using system memory. So there is some technical justification, I think. They are 100% price gougers though.

gaudystead
0 replies
3h16m

writing this from memory

Gave me a chuckle ;)

beambot
0 replies
3h26m

One company's "Price Gouging" is another's "Market Segmentation"

eddieroger
1 replies
5h59m

Probably, but since we're talking about an Apple product, comparing it to macOS makes sense, since they all share the same bottom layer.

0x457
0 replies
4h30m

Not "probably" - that's just how any "modern" OS works. It also uses RAM as a cache to avoid reads from storage, just like any other modern OS.

Apple uses it for segmentation and nothing else.

Modern being - since the 80s.

__turbobrew__
4 replies
5h34m

That is called a buffer/page cache and has existed in operating systems since the 1980s.

astrange
1 replies
3h14m

It's unified memory which means the SSD controller is also using the system memory. So more flash needs more memory.

spixy
0 replies
46m

Then give me more memory. 512gb storage with 16gb ram

riazrizvi
0 replies
4h57m

No this is caching with SSDs, it's not the same league.

btown
0 replies
4h56m

With hardware where power-off is only controlled by software, battery life is predictable, and large amounts of data like raw video are being persisted, they might have a very aggressive version of page caching, and a large amount of storage may imply that a scale-up of RAM would be necessary to keep all the data juggling on a happy path. That said, there’s no non-business reasons why they couldn’t extend that large RAM to smaller storage systems as well.

gruez
1 replies
4h53m

How does that justify locking the 16GB option to 1TB/2TB options?

0x457
0 replies
4h25m

Since memory is on their SoC, it's challenging to maintain multiple SKUs. This segmentation makes sense to me as a consumer.

lozenge
0 replies
3h26m

The write speed needs to match what the camera can output or the WiFi/cellular can download. It has nothing to do with the total size of the storage.

hosteur
0 replies
2h26m

I don't think it's strictly for price gouging/segmentation purposes.

I think it is strictly for that purpose.

bschne
0 replies
2h54m

Shouldn't the required cache size be dependent on throughput more so than disk size? It does not necessarily seem like you'd need a bigger write cache if the disk is bigger, people who have a 2TB drive don't read/write 2x as much in a given time as those with a 1TB drive. Or am I missing something?

Aerbil313
0 replies
29m

Do people not understand that Apple's 'price gouging' is about UX? A person who has the money to buy a 1TB iPad is worth more than the average customer. 16GB of RAM doubtlessly results in a faster UX, and that person is more likely to keep purchasing.

samatman
9 replies
6h39m

It's fairly absurd that they're still selling a 256gb "Pro" machine in the first place.

That said, Apple's policy toward SKUs is pretty consistent: you pay more money and you get more machine, and vice versa. The MacBooks are the only product which has separately configurable memory / storage / chip, and even there some combinations aren't manufactured.

phkahler
4 replies
6h27m

> It's fairly absurd that they're still selling a 256gb "Pro" machine in the first place.

My guess is they want you to use their cloud storage and pay monthly for it.

CharlieDigital
1 replies
5h51m

That doesn't make any sense.

I'm not storing my Docker containers and `node_modules` in the cloud.

Pro isn't just images and videos.

davedx
0 replies
5h34m

This is a tablet not a laptop

skydhash
0 replies
4h58m

Or use external storage. I’d be wary of using my iPad as primary storage anyway. It only holds work in progress and whatever media I'm currently watching or reading.

samatman
0 replies
3h1m

If that were the goal (I don't think it is), they'd be better off shipping enough storage to push people into the 2TB tier, which is $11 vs. $3 a month for 200GB.

I said this in a sibling comment already, but I think it's just price anchoring so that people find the $1500 they're actually going to pay a bit easier to swallow.

azinman2
2 replies
4h55m

I have a 256G iPhone. I think I’m using like 160G. Most stuff is just in the cloud. For an iPad it wouldn’t be any different, modulo media cached for flights. I could see some cases like people working on audio to want a bunch stored locally, but it’s probably in some kind of compressed format such that it wouldn’t matter too much.

What is your concern?

samatman
1 replies
3h4m

I don't know about 'concern' necessarily, but it seems to me that 512GB for the base Pro model is a more realistic minimum. There are plenty of use cases where that amount of storage is overkill, but they're all served better by the Air, which come in the same sizes and as little as 128GB storage.

I would expect most actual users of the Pro model, now that 13 inch is available at the lower tier, would be working with photos and video. Even shooting ProRes off a pro iPhone is going to eat into 256 pretty fast.

Seems like that model exists mainly so they can charge $1500 for the one people are actually likely to get, and still say "starts at $1299".

Then again, it's Apple, and they can get away with it, so they do. My main point here is that the 256GB model is bad value compared to the equivalent Air model, because if you have any work where the extra beef is going to matter, it's going to eat right through that amount of storage pretty quick.

its_ethan
0 replies
30m

I think you're underestimating the number of people who go in to buy an iPad and gravitate to the Pro because it looks the coolest and sounds like a luxury thing. For those people, who are likely just going to use it for web browsing and streaming videos, the cheapest configuration is the only one they care about.

That type of buyer is a very significant % of sales for iPad pros. Despite the marketing, there are really not that many people (as a % of sales) that will be pushing these iPad's anywhere even remotely close to their computational/storage/spec limits.

dijit
0 replies
5h26m

Real creative pros will likely be using a 10G Thunderbolt NIC to a SAN; local video editing is not advised unless it’s only a single project at a time.

Unless you are a solo editor.

giancarlostoro
8 replies
6h20m

Honestly though, that's basically every tablet: you can't change the RAM, you get what you get and that's it. Maybe they should call them by different names, like Pro Max for the ones with 16GB, in order to make it more palatable? Small psychological hack.

dotnet00
7 replies
6h17m

The Samsung tablets at least still retain the SD card slot, so you can focus more on the desired amount of RAM and not worry too much about the built-in storage size.

Teever
4 replies
6h5m

It would be cool if regulators mandated that companies like Apple are obligated to provide models of devices with SD card slots and a seamless way to integrate this storage into the OS/applications.

That combined with replaceable batteries would go a long way to reduce the amount of ewaste.

mschuster91
2 replies
5h56m

And then people would stick alphabet-soup SD cards into their devices and complain about performance and data integrity, it's enough of a headache in the Android world already (or has been before Samsung and others finally decided to put in enough storage for people to not rely on SD cards any more).

In contrast, Apple's internal storage to my knowledge always is very durable NVMe, attached logically and physically directly to the CPU, which makes their shenanigans with low RAM size possible in the first place - they swap like hell but as a user you barely notice it because it's so blazing fast.

Teever
1 replies
5h42m

Yeah jackasses are always gonna jackass. There's still a public interest in making devices upgradable for the purpose of minimizing e-waste.

I'd just love to buy a device with a moderate amount of non-upgradeable SSD and an SD slot so that I can put more memory in it later and the device can last longer.

mschuster91
0 replies
3h9m

Agreed but please with something other than microSD cards. Yes, microSD Express is a thing, but both cards and hosts supporting it are rare, the size format doesn't exactly lend itself to durable flash chips, thermals are questionable, and even the most modern microSD Express cards barely hit 800 MB/sec speed, whereas Apple's stuff has hit twice or more that for years [2].

[1] https://winfuture.de/news,141439.html

[2] https://9to5mac.com/2024/03/09/macbook-air-m3-storage-speeds...

davedx
0 replies
5h34m

Not everything has to be solved by regulators. The walled garden is way more important to fix than arbitrary hardware configurations

zozbot234
1 replies
4h41m

Doesn't the iPad come with a USB-C port nowadays? You can attach an external SD card reader.

amlib
0 replies
3h20m

Just like I don't want an umbilical cord hanging out of me just to perform the full extent of my bodily functions, I also wouldn't want a dongle hanging off my tablet for it to be deemed usable.

littlestymaar
3 replies
2h8m

8gb of ram.

WTF? Why so little? That's insane to me, that's the amount of RAM you get with a mid-range android phone.

MyFirstSass
2 replies
2h4m

We've reached a point where their chips have become so amazing that they have to introduce "fake scarcity" and "fake limits" to sell their pro lines, dividing their customers into haves and have-nots while actively stalling the entire field for the masses.

littlestymaar
0 replies
1h39m

Yes, but we're talking about the “pro” version here which makes even less sense!

its_ethan
0 replies
28m

You could, alternatively, read less malice into the situation and realize that the majority of people buying an iPad pro don't even need 8gb of RAM to do what they want to do with the device (web browsing + video streaming).

ilikehurdles
3 replies
3h59m

The iPod classic had 160 GB of storage fifteen years ago.

No device should be measuring storage in the gigabytes in 2024. Let alone starting at $1000 offering only 256GB. What ridiculousness.

vrick
0 replies
3h15m

To be fair, the iPod Classic used a platter drive and iPads use high-speed SSD storage. That being said, it's been years of the same storage options, and at those prices it should be much higher, along with their iCloud storage offerings.

dmitrygr
0 replies
3h35m

So browse the web and play modern games on an iPod classic

LeoPanthera
0 replies
3h12m

The iPod Classic had spinning rust. Don't pretend it's comparable with a modern SSD.

praseodym
1 replies
6h42m

And also one less CPU performance core for the lower storage models.

05
0 replies
6h14m

Well, they have to sell the dies with failed cores somehow..

sambazi
0 replies
6h10m

8gb of ram

not again

intrasight
0 replies
3h22m

"Think Different" ;)

simonbarker87
49 replies
6h45m

Seeing an M series chip launch first in an iPad must be a result of some mad supply-chain and manufacturing-related hangovers from COVID.

If the iPad had better software and could be considered a first-class productivity machine, then it would be less surprising, but the one thing no one says about iPads is “I wish this chip were faster”.

MuffinFlavored
14 replies
6h25m

To me it just feels like a soft launch.

You probably have people (like myself) trying to keep up with the latest MacBook Air who get fatigued having to get a new laptop every year (I just upgraded to the M3 not too long ago, from the M2, and before that... the M1... is there any reason to? Not really...), so now they are trying to entice people who don't have iPads yet / who are waiting for a reason to do an iPad upgrade.

For $1,300 configured with the keyboard, I have no clue what I'd do with this device. They very deliberately are keeping iPadOS + MacOS separate.

low_common
4 replies
6h22m

You get a new laptop every year?

MuffinFlavored
1 replies
5h51m

I'm sort of "incentivized" to by Apple because as soon as they release a new one, the current device you have will be at "peak trade-in value" and will deteriorate from there.

It's a negligible amount of money. It's like, brand new $999, trade in for like $450. Once a year... the $549 remainder over 12 months is $45.75/mo to have the latest and greatest laptop.

fwip
0 replies
5h17m

How much is a 2-year-old laptop worth? Because if you buy a new laptop every two years and don't even sell the old one, you're only spending $500 a year, which is less than you're spending now.
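
To make the comparison concrete, here's a rough back-of-the-envelope sketch in Python. The $999 price and $450 one-year trade-in are the figures quoted above; the $300 two-year resale value is purely an assumed placeholder, not a real quote.

    # Amortized cost per year under different upgrade cadences.
    # $999 and $450 come from the comment above; $300 is an assumption.
    new_price = 999
    trade_in_after_1yr = 450      # from the parent comment
    resale_after_2yr = 300        # hypothetical assumption

    yearly_upgrade = new_price - trade_in_after_1yr        # $549/yr
    biyearly_keep = new_price / 2                           # ~$500/yr
    biyearly_sell = (new_price - resale_after_2yr) / 2      # ~$350/yr

    print(f"upgrade yearly with trade-in:    ${yearly_upgrade:.0f}/yr")
    print(f"every 2 years, keep the old one: ${biyearly_keep:.0f}/yr")
    print(f"every 2 years, sell the old one: ${biyearly_sell:.0f}/yr")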

bombcar
0 replies
5h49m

It's not terribly expensive if you trade-in or otherwise sell or hand down the previous.

I went from M1 to M1 Pro just to get more displays.

Teever
0 replies
6h3m

If you replace your laptop every year or two and sell the old one online, you can stay on the latest technology for only a slight premium.

jasonjmcghee
3 replies
6h2m

Still using my M1 Air and had no interest in updating to M3. Battery life has dropped a fair amount, but still like 8+ hours. That's going to be the trigger to get a new one. If only batteries lasted longer.

foldr
2 replies
5h54m

I don't think it costs that much to have the battery replaced compared to the price of a new laptop.

jasonjmcghee
1 replies
3h30m

Curious how much it would cost. I think parts are on the order of $150? So maybe $400-500 for an official repair?

If I can hold out another year or two, would probably end up just getting a new one

baq
3 replies
5h58m

I feel like I bought the M1 air yesterday. Turns out it was ~4 years ago. Never felt the need to upgrade.

wil421
0 replies
2h0m

The only reason I upgraded is my wife “stole” my M1 air. I bought a loaded M3 MBP and then they came out with a 15” Air with dual monitor capabilities. Kinda wish I had the air again. It’s not like I move it around much but the form factor is awesome.

dialup_sounds
0 replies
5h18m

Interestingly, Apple still sells M1 Airs through Walmart, but not their own website.

Toutouxc
0 replies
4h45m

Same here, my M1 Air still looks and feels like a brand new computer. Like, I still think of it as "my new MacBook". It's my main machine for dev work and some hobby photography and I'm just so happy with it.

mvkel
0 replies
6h19m

I think (hope) wwdc changes this. The function keys on the Magic Keyboard give me hope.

Also, you know you don't HAVE to buy a laptop every year, right?

mort96
12 replies
5h20m

To be honest, I wish my iPad's chip was slower! I can't do anything other than watch videos and use drawing programs on an iPad, why does it need a big expensive power hungry and environmentally impactful CPU when one 1/10 the speed would do?

If I could actually do something with an iPad there would be a different discussion, but the operating system is so incredibly gimped that the most demanding task it's really suited for is .. decoding video.

steveridout
4 replies
5h1m

I'm under the impression that this CPU is faster AND more efficient, so if you do equivalent tasks on the M4 vs an older processor, the M4 should be less power hungry, not more. Someone correct me if this is wrong!

mort96
3 replies
4h56m

It's more power efficient than the M3, sure, but surely it could've been even more power efficient if it had worse performance simply from having fewer transistors to switch? It would certainly be more environmentally friendly at the very least!

Kon-Peki
1 replies
4h35m

The most environmentally friendly thing to do is to keep your A12Z for as long as you can, ignoring the annual updates. And when the time comes that you must do a replacement, get the most up to date replacement that meets your needs. Change your mindset - you are not required to buy this one, or the next one.

mort96
0 replies
4h23m

Of course, I'm not buying this one or any other until something breaks. After all, my current A12Z is way too powerful for iPadOS. It just pains me to see amazing feats of hardware engineering like these iPads with M4 be completely squandered by a software stack which doesn't facilitate more demanding tasks than decoding video.

Millions of people will be buying these things regardless of what I'm doing.

_ph_
0 replies
29m

Look at the efficiency cores. They are all you are looking for.

Shank
2 replies
5h4m

Why does it need a big expensive power hungry and environmentally impactful CPU when one 1/10 the speed would do?

Well, it's not. Every process shrink improves power efficiency. For watching videos, you're sipping power on the M4. For drawing...well if you want low latency while drawing, which generally speaking, people do, you...want the processor and display to ramp up to compensate and carry strokes as fast as possible?

Obviously if your main concern is the environment, you shouldn't upgrade and you should hold onto your existing model(s) until they die.

mort96
1 replies
4h51m

From what I can tell, the 2020 iPad has perfectly fine latency while drawing, and Apple hasn't been advertising lower latencies for each generation; I think they pretty much got the latency thing nailed down. Surely you could make something with the peak performance of an A12Z use less power on average than an M4?

As for the environmental impact, whether I buy or don't buy this iPad (I won't, don't worry, my 2020 one still works), millions of people will. I don't mind people buying powerful machines when the software can make use of the performance, but for iPad OS..?

Kon-Peki
0 replies
4h38m

The M4 is built on the newest, best, most expensive process node (right?). They've got to amortize out those costs, and then they could work on something cheaper and less powerful. I agree that they probably won't, and that's a shame. But still, the M4 is most likely one of the best options for the best use of this new process node.

kylehotchkiss
1 replies
3h8m

Environmentally impactful CPU when one 1/10 the speed would do

Apple's started to roll out green energy charging to devices: https://support.apple.com/en-us/108068

If I had to ballpark estimate this, your iPad probably uses less energy per year than a strand of incandescent holiday lights does in a week. Maybe somebody can work out that math.
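
A rough attempt at that math, with every figure below being an assumption rather than a measurement: call the iPad's annual energy use something like 12 kWh, and assume a 100-bulb strand of old-style C9 incandescents (roughly 7 W per bulb) running 6 hours a day.

    # Back-of-the-envelope only; every figure below is an assumption.
    ipad_kwh_per_year = 12         # rough guess for annual iPad energy use
    bulbs = 100                    # one strand of C9 incandescent bulbs
    watts_per_bulb = 7             # typical old-style C9 bulb
    hours_per_day = 6
    days = 7

    strand_kwh_per_week = bulbs * watts_per_bulb * hours_per_day * days / 1000
    print(f"light strand, one week: {strand_kwh_per_week:.1f} kWh")  # ~29.4 kWh
    print(f"iPad, one year:         {ipad_kwh_per_year:.1f} kWh")    # ~12 kWh

Under those assumptions the claim holds comfortably; with small incandescent mini-lights or LED strands it would not.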

mort96
0 replies
2h24m

The environmental concerns I have aren't really power consumption. Making all these CPUs takes a lot of resources.

aurareturn
0 replies
2h35m

To be honest, I wish my iPad's chip was slower! I can't do anything other than watch videos and use drawing programs on an iPad, why does it need a big expensive power hungry and environmentally impactful CPU when one 1/10 the speed would do?

A faster SoC can finish the task with better "work done/watt". Thus, it's more environmentally friendly. Unless you're referring to the resources dedicated to advancing computers such as the food engineers eat and the electricity chip fabs require.

_ph_
0 replies
30m

It has 6 efficiency cores. Every single one of them is extremely power efficient, yet still faster than an iPad from 2-3 generations back. So unless you go full throttle, an M4 is going to be by far the most efficient CPU you can have.

themagician
6 replies
4h10m

Welcome to being old!

Watch a 20-year old creative work on an iPad and you will quickly change your mind. Watch someone who has, "never really used a desktop, [I] just use an iPad" work in Procreate or LumaFusion.

The iPad has amazing software. Better, in many ways, than desktop alternatives if you know how to use it. There are some things they can't do, and the workflow can be less flexible or full featured in some cases, but the speed at which some people (not me) can work on an iPad is mindblowing.

I use a "pro" app on an iPad and I find myself looking around for how to do something and end up having to Google it half the time. When I watch someone who really knows how to use an iPad use the same app they know exactly what gesture to do or where to long tap. I'm like, "How did you know that clicking on that part of the timeline would trigger that selection," and they just look back at you like, "What do you mean? How else would you do it?"

There is a bizarre and almost undocumented design language of iPadOS that some people simply seem to know. It often pops up in those little "tap-torials" when a new feature rolls out that I either ignore or forget… but other people internalize them.

quasarj
2 replies
3h38m

They can have my keyboard when they pry it from my cold dead hands! And my mouse, for that matter.

themagician
0 replies
3h25m

Oh, I'm with you. But the funny thing is, they won't even want it.

I have two iPads and two pencils—that way each iPad is never without a pencil—and yet I rarely use the pencil. I just don't think about it. But then when I do, I'm like, "Why don't I use this more often? It's fantastic."

I have tried and tried to adapt and I cannot. I need a mouse, keyboard, separate numpad, and two 5K displays to mostly arrive at the same output that someone can do with a single 11" or 13" screen and a bunch of different spaces that can be flicked through.

I desperately wanted to make the iPad my primary machine and I could not do it. But, honestly, I think it has more to do with me than the software. I've become old and stubborn. I want to do things my way.

Terretta
0 replies
3h3m

Magic Keyboard is both, and the current (last, as of today) iteration is great.

It is just fine driving Citrix or any web app like VSCode.dev.

aurareturn
2 replies
2h36m

I'm with you. I think HN (and the conversational internet) disproportionately contains more laptop people than the general public.

A lot of the younger generation does all their work on their phone and tablet and does not have a computer.

Tiktaalik
1 replies
2h22m

That's true if those younger folks are, as the parent says, in the creative sector.

The iPad has been a workflow gamechanger for folks who use Photoshop etc., but users are still prevented from coding on it.

themagician
0 replies
1h29m

It's actually come a long way. The workflow is still… sub-optimal, but there are some really nice terminal apps (LaTerminal, Prompt, ShellFish, iSH) which are functional too. Working Copy is pretty dope for working with git once you adapt to it.

I do most of my dev on a Pi5 now, so actually working on the iPad is not that difficult.

If they ever release Xcode for iPadOS that would be a true gamechanger.

margalabargala
2 replies
6h41m

Maybe they're clocking it way down. Same performance, double the battery life.

cjauvin
1 replies
6h23m

I very rarely wish the battery of my iPad Pro 2018 would last longer, as it's already so good, even considering the age factor.

0x457
0 replies
4h8m

Yeah, I don't think about charging my iPad throughout the day, and I constantly use it. Maybe it's in the low 20s late at night, but it never bothered me.

Zigurd
2 replies
6h40m

My guess is that the market size fit current yields.

hajile
0 replies
6h21m

They already released all their MacBooks and the latest iPhone on N3B, which is the worst-yielding 3nm process from TSMC. I doubt yields are the issue here.

It's suspected that the fast release for M4 is so TSMC can move away from the horrible-yielding N3B to N3E.

Unfortunately, N3E is less dense. Paired with a couple more little cores, an increase in little core size, 2x larger NPU, etc, I'd guess that while M3 seems to be around 145mm2, this one is going to be quite a bit larger (160mm2?) with the size hopefully being offset by decreased wafer costs.

a_vanderbilt
0 replies
6h34m

I think this is the most likely explanation. Lower volume for the given product matches supply better, and since it's clocked down and has a lower target for GPU cores it has better yields.

andrewmunsell
1 replies
5h53m

My current assumption is that this has to do with whatever "AI" Apple is planning to launch at WWDC. If they launched a new iPad with an M3 that wasn't able to sufficiently run on-device LLMs or whatever new models they are going to announce in a month, it would be a bad move. The iPhones in the fall will certainly run some new chip capable of on-device models, but the iPads (being announced in the Spring just before WWDC) are slightly inconveniently timed since they have to announce the hardware before the software.

owenpalmer
0 replies
4h48m

interesting theory, we'll see what happens!

DanHulton
1 replies
6h38m

I'm wondering if it's because they're hitting the limits of the architecture, and it sounds way better to compare M4 vs M2 as opposed to vs M3, which they'd have to do if it launched in a Macbook Pro.

mason55
0 replies
5h23m

Eh, they compared the M3 to the M1 when they launched it. People grumbled and then went on with their lives. I don't think they'd use that as a reason for making actual product decisions.

eitally
0 replies
5h32m

My guess is that the M4 and M3 are functionally almost identical so there's no real reason for them to restrict the iPad M4 launch until they get the chip into the MacBook / Air.

drexlspivey
0 replies
4h35m

Meanwhile the mac mini is still on M2

asddubs
0 replies
6h42m

Well, it also affects the battery life, so it's not entirely wasted on the ipad

alexpc201
0 replies
6h15m

To offer something better to those who have an iPad Pro M2 and a more powerful environment to run heavier games.

NorwegianDude
38 replies
5h31m

M4 has Apple’s fastest Neural Engine ever, capable of up to 38 trillion operations per second, which is faster than the neural processing unit of any AI PC today.

I always wonder what crazy meds Apple employees are on. Two RTX 4090s is quite common for hobbyist use, and that is 1321 TOPS each, making two over 69 times more than what Apple claims to be the fastest in the world. That performance is literally less than 1 % of a single H200.

Talk about misleading marketing...

akshayt
13 replies
5h25m

They are referring to integrated NPUs in current CPUs, like in the Intel Core Ultra.

They explicitly mentioned in the event that the industry refers to the neural engine as an NPU.

mort96
9 replies
5h18m

But the word they used isn't "NPU" or "neural engine" but "AI PC"??? If I build a PC with a ton of GPU power with the intention of using that compute for machine learning then that's an "AI PC"

fwip
2 replies
5h16m

The technicality they're operating on is that the "AI PC" doesn't have a "neural processing unit."

faster than the neural processing unit of any AI PC today.
mort96
1 replies
5h6m

Ah. I guess you could argue that that's technically not directly false. That's an impressive level of being dishonest without being technically incorrect.

By comparing the non-existent neural engine in your typical AI PC, you could claim that the very first SoC with an "NPU" is infinitely faster than the typical AI PC

oarth
0 replies
2h36m

An AI PC is a PC suited to be used for AI... Dual 4090 is very suited for small scale AI.

It might be a marketing term by Microsoft, but that is just dumb, and has nothing to do with what Apple says. If this was in relation to Microsoft's "AI PC" then Apple should have written "Slower than ANY AI PC." instead, as the minimum requirement for an "AI PC" by Microsoft seems to be 45 TOPS, and the M4 is too slow to qualify by the Microsoft definition.

Are you heavily invested in Apple stock or something? When a company clearly lies and tries to mislead people, call them out on it, don't defend them. Companies are not your friend. Wtf.

SllX
1 replies
5h6m

On paper you’re absolutely correct. AI PC is marketing rubbish out of Wintel. Apple’s doing a direct comparison to that marketing rubbish and just accepting that they’ll probably have to play along with it.

So going by the intended usage of this marketing rubbish, the comparison Apple is making isn’t to GPUs. It’s to Intel’s chips that, like Apple’s, integrate CPU, GPU, and NPU. They just don’t name drop Intel anymore when they don’t have to.

mort96
0 replies
5h1m

If they literally just said that the iPad's NPU is faster than the NPU of any other computer it'd be fine, I would have no issue with it (though it makes you wonder, maybe that wouldn't have been true? Maybe Qualcomm or Rockchip have SoCs with faster NPUs, so the "fastest of any AI PC" qualifier is necessary to exclude those?)

numpad0
0 replies
3h48m

Microsoft/Intel have been trying to push this "AI-enabled PC" or whatever for a few months, to obsolete laptops without an NPU stuffed into the unused I/O die space of the CPU. Apple weaponized that in this instance.

1: https://www.theregister.com/2024/03/12/what_is_an_ai_pc/

aurareturn
0 replies
5h2m

“AI PC” is what Microsoft and the industry have deemed SoCs that have an NPU in them. It’s not a term that Apple made up. It’s what the industry is using.

Of course, Apple has had an NPU in their SoC since the first iPhone with FaceID.

NorwegianDude
1 replies
5h15m

The text clearly states faster than any AI PC, not that it's faster than any NPU integrated into a CPU.

They could have written it correctly, but that sounds way less impressive, so instead they make up shit to make it sound very impressive.

aurareturn
0 replies
5h1m

https://www.microsoft.com/en-us/americas-partner-blog/2024/0...

It’s the term Microsoft, Intel, AMD, and Qualcomm decided to rally around. No need to get upset at Apple for using the same term as reference for comparison.

Ps. Nvidia also doesn’t like the term because of precisely what you said. But it’s not Apple that decided to use this term.

janalsncm
0 replies
4h59m

I’ve never heard anyone refer to an NPU before. I’ve heard of GPU and TPU. But in any case, I don’t know the right way to compare Apple’s hardware to a 4090.

MBCook
7 replies
5h3m

That’s not a neural processing unit. It’s a GPU.

They said they had the fastest NPU in a PC. Not the fastest on earth (one of the nVidia cards, probably). Not the fastest way you could run something (probably a 4090 as you said). Just the fastest NPU shipping in a PC. Probably consumer PC.

It’s marketing, but it seems like a reasonable line to draw to me. It’s not like when companies draw a line like “fastest car under $70k with under 12 cylinders but available in green from the factory”.

NorwegianDude
4 replies
4h40m

Of course a GPU from Nvidia is also an NPU. People are spending billions each month on Nvidia, because it's a great NPU.

The fact is that a GPU from Nvidia is a much faster NPU than a CPU from Apple.

It is marketing as you say, but it's misleading marketing, on purpose. They could have simply written "the fastest integrated NPU of any CPU" instead. This is something Apple often does on purpose, and people believe it.

MBCook
3 replies
4h21m

A GPU does other things. It’s designed to do something else. That’s why we call it a GPU.

It just happens to be good at neural stuff too.

There’s another difference too. Apple’s NPU is integrated into their chip. Intel and AMD are doing the same. A 4090 is not integrated into a CPU.

I’m somewhat guessing. Apple said NPU is the industry term, honestly I’d never heard it before today. I don’t know if the official definition draws a distinction that would exclude GPUs or not.

I simply think the way Apple presented things seemed reasonable. When they made that claim the fact that they might be comparing against a 4090 never entered my mind. If they had said it was the fastest way to run neural networks I would have questioned it, no doubt. But that wasn’t the wording they used.

sudosysgen
0 replies
3h39m

Nvidia GPUs basically have an NPU, in the form of Tensor units. They don't just happen to be good at matmul, they have specific hardware designed to run neural networks.

There is no actual distinction. A GPU with Tensor cores (=matmul units) really does have an NPU just as much as a CPU with an NPU (=matmul units).

oarth
0 replies
3h14m

A GPU does other things.

Yes, and so does the M4.

It just happens to be it’s good at neural stuff too.

No, it's no coincidence. Nvidia has been focusing on neural nets, same as Apple.

There’s another difference too. Apple’s NPU is integrated in their chip.

The neural processing capabilities of Nvidia products(Tensor Cores) are also integrated in the chip.

A 4090 is not integrated into a CPU.

Correct, but nobody ever stated that. Apple stated that M4 was faster than any AI PC today, not that it's the fastest NPU integrated into a CPU. And by the way, the M4 is also a GPU.

I don’t know if the official definition draws a distinction that would exclude GPUs or not.

An NPU can be a part of a GPU, a CPU, or its own chip.

If they had said it was the fastest way to run neural networks I would have questioned it,

They said fastest NPU, neural processing unit. It's the term Apple and a few others use for their AI accelerator. The whole point of an AI accelerator is performance and efficiency. If something does a better job at it, then it's a better AI accelerator.

lostmsu
0 replies
1h2m

You know the G in GPU stands for Graphics, right? So if you want to play a game of words, Nvidia's device dedicated to something else is 30 times faster than Apple's "fastest" device dedicated specifically to neural processing.

adrian_b
1 replies
4h30m

Both Intel's and AMD's laptop CPUs include NPUs, and they are indeed slower than M4.

Nevertheless, Apple's bragging is a little weird, because both Intel and AMD have already announced that in a few months they will launch laptop CPUs with much faster NPUs than Apple M4 (e.g. 77 TOPS for AMD), so Apple will hold the first place for only a very short time.

MBCook
0 replies
4h25m

But do you expect them to say it’s the “soon to be second fastest”?

It’s the fastest available today. And when they release something faster (M4 Pro or Mac or Ultra or whatever) they’ll call that the fastest.

Seems fair to me.

smith7018
5 replies
4h37m

You’re calling $3,600 worth of GPUs “quite common for hobbyist use” and then comparing an iPad to a $40,000 AI-centric GPU.

sudosysgen
4 replies
3h51m

It's almost 70x more powerful. A 4 year old 3070 laptop was cheaper when it came out and has about 200 TOPS, 7 times as much. It's just factually incorrect to call it "faster than any AI PC", it's far slower than a cheaper laptop from 4 years ago.

astrange
1 replies
3h9m

"Powerful" isn't the thing that matters for a battery-powered device. Power/perf is.

sudosysgen
0 replies
2h53m

If they thought that peak performance didn't matter, they wouldn't quote peak performance numbers in their comparison, and yet they did. Peak performance clearly matters, even in battery powered devices: many workloads are bursty and latency matters then, and there are workloads where you can be expected to be plugged in. In fact, one such workload is generative AI which is often characterized by burst usage where latency matters a lot, which is exactly what these NPUs are marketed towards.

acdha
1 replies
2h4m

AI PC is a specific marketing term which Intel is using for their NPU-equipped products where they’re emphasizing low-power AI:

https://www.intel.com/content/www/us/en/newsroom/news/what-i...

In that context it seems fair to make the comparison between a MacBook and the PC version which is closest on perf/watt rather than absolute performance on a space heater.

sudosysgen
0 replies
39m

Then make a comparison on perf/watt. As it is, we have no way of knowing if it's better on a perf/watt basis than something like an RTX4050 which is 10 times faster and uses about 10 times the power.

The PC version of accelerated AI workloads, in 2024, is a GPU with optimized matmul cores. It's the most powerful and most efficient way of accelerating neural network loads right now. Comparing to a suboptimal implementation and making it sound like you're comparing to the industry is misleading.

If they are referring to a specific marketing term for a single company, they should do so explicitly. Otherwise, it's just being misleading, because it's not even including Apple's main competition, which is AMD and NVidia, and using a generic sounding term.

phren0logy
1 replies
4h48m

I have a MacBook M2 and a PC with a 4090 ("just" one of them) - the VRAM barrier is usually what gets me with the 4090 when I try to run local LLMs (not train them). For a lot of things, my MacBook is fast enough, and with more RAM, I can run bigger models easily. And, it's portable and sips battery.

The marketing hype is overblown, but for many (most? almost all?) people, the MacBook is a much more useful choice.

ProllyInfamous
0 replies
4h2m

Expanding on this, I have an M2Pro (mini) & a tower w/GPU... but for daily driving the M2Pro idles at 15-35W whereas the tower idles at 160W.

Under full throttle/load, even though the M2Pro is rated as less-performant, it is only using 105W — the tower/GPU are >450W!
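
For a sense of scale, here's a quick sketch of what that idle-power gap adds up to over a year of desk time. The wattages are the ones quoted above; the 8 hours/day and the electricity price are assumptions.

    # Idle-power comparison sketch. The wattages are the figures quoted
    # above; hours/day and the electricity price are assumptions.
    hours_per_day = 8
    days_per_year = 365
    usd_per_kwh = 0.15

    def yearly(watts):
        kwh = watts * hours_per_day * days_per_year / 1000
        return kwh, kwh * usd_per_kwh

    # 25W is the midpoint of the 15-35W idle range mentioned above.
    for name, watts in [("M2 Pro mini idle", 25), ("tower idle", 160)]:
        kwh, cost = yearly(watts)
        print(f"{name}: {kwh:.0f} kWh/yr (~${cost:.0f}/yr)")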

talldayo
0 replies
5h23m

Watching this site recover after an Apple press release is like watching the world leaders deliberate Dr. Strangelove's suggestions.

syntaxing
0 replies
5h24m

Definitely misleading but they’re talking about “AI CPU” rather than GPUs. They’re pretty much taking a jab at Intel.

password54321
0 replies
4h23m

Just two 4090s? If you don't have at least 8 4090s do not even call yourself a hobbyist.

numpad0
0 replies
3h34m

Also the 38 TOPS figure is kind of odd. Intel had already shown laptop CPUs with 45 TOPS NPU[1] though it hasn't shipped, and Windows 12 is rumored to require 40 TOPS. If I'm doing math right, (int)38 falls short of both.

1: https://www.tomshardware.com/pc-components/cpus/intel-says-l...

make3
0 replies
5h10m

(An H200 is a five-figure datacenter GPU without a display port; it's not what they mean by PC, but your general point still stands)

citizenpaul
0 replies
3h0m

This is standard Apple advertising: the best whatever in the world, which is really just some standard thing with a different name. Apple is like clothing makers that "vanity size" their clothes. If you don't know, that basically means a size 20 is really a 30, a size 21 is a 31, and so on.

Neural processing unit is basically a made-up term at this point, so of course they can have the fastest in the world.

SkyPuncher
0 replies
4h48m

All of the tech specs comparisons were extremely odd. Many things got compared to the M1, despite the most recent iPad having the M2. Heck, one of the comparisons was to the A11 chip that was introduced nearly 7 years ago.

I generally like Apple products, but I cannot stand the way they present them. They always hide how it compares against the directly previous product.

Aurornis
0 replies
4h38m

It's a marketing trick. They're talking about NPUs specifically, which haven't really been rolled out on the PC side.

So while they're significantly slower than even casual gaming GPUs, they're technically the fastest NPUs on the market.

It's marketing speak.

adrian_b
11 replies
5h43m

The Most Powerful Neural Engine Ever

While it is true that the claimed performance for M4 is better than for the current Intel Meteor Lake and AMD Hawk Point, it is also significantly lower (e.g. around half) than the AI performance claimed for the laptop CPU+GPU+NPU models that both Intel and AMD will introduce in the second half of this year (Arrow Lake and Strix Point).

whynotminot
8 replies
4h15m

will introduce

Incredible that in the future there will be better chips than what Apple is releasing now.

adrian_b
4 replies
3h42m

The point is that it is a very near future, a few months away.

Apple is also bragging very hyperbolically that the NPU they introduce right now is faster than all the older NPUs.

So, while what Apple says, "The Most Powerful Neural Engine Ever", is true now, it will be true for only a few months. Apple has done a good job, so, as is normal, at launch their NPU is the fastest. However, this does not deserve any special praise; it is just normal, as normal as the fact that the next NPU launched by a competitor will be faster.

Only if the new Apple NPU would have been slower than the older models, that would have been a newsworthy failure. A newsworthy success would have been only if the new M4 would have had at least a triple performance than it has, so that the competitors would have needed more than a year to catch up with it.

whynotminot
3 replies
3h37m

Is this the first time you're seeing marketing copy? This is an entirely normal thing to do. Apple has an advantage with the SoC they are releasing today, and they are going to talk about it.

I expect we will see the same bragging from Apple's competitors whenever they actually launch the chips you're talking about.

Apple has real silicon shipping right now. What you're talking about doesn't yet exist.

A newsworthy success would have been only if the new M4 would have had at least a triple performance than it has, so that the competitors would have needed more than a year to catch up with it.

So you decide what's newsworthy now? Triple? That's so arbitrary.

I certainly better not see you bragging about these supposed chips later if they're not three times faster than what Apple just released today.

adrian_b
2 replies
3h28m

I said triple because the competitors are expected to have double the speed in a few months.

If M4 were 3 times faster than it is, it would have remained faster than Strix Point and Arrow Lake, which would have been replaced only next year, giving supremacy to M4 for more than a year.

If M4 were twice as fast, it would have continued to share the first position for more than a year. As it is, it will be the fastest for one quarter, after which it will have only half of the top speed.

whynotminot
1 replies
3h17m

And then Apple will release M5 next year, presumably with another increase in TOPS that may well top their competitors. This is how product releases work.

spxneo
0 replies
2h3m

strongly doubt we will see M5 so soon

hmottestad
2 replies
3h53m

Don’t worry. It’s Intel we’re talking about. They may say that it’s coming out in 6 months, but that’s never stopped them from releasing it in 3 years instead.

spxneo
0 replies
2h4m

I literally don't give a fck about Intel anymore; they are irrelevant.

The Taiwanese silicon industrial complex deserves our dollars. Their workers are insanely hard working and it shows in their products.

adrian_b
0 replies
3h36m

AMD is the one that has given more precise values (77 TOPS) for their launch, their partners are testing the engineering samples and some laptop product listings seem to have been already leaked, so the launch is expected soon (presentation in June, commercial availability no more than a few months later).

spxneo
0 replies
2h15m

damn bro thanks for this

Here I am celebrating not pulling the trigger on an M2 128GB yesterday.

Now I'm realizing the M4 ain't shit.

will wait a few more months for what you described. will probably wait for AMD

Given that Microsoft has defined that only processors with an NPU with 45 TOPS of performance or over constitute being considered an 'AI PC',

so already with 77 TOPS it just destroys M4. Rumoured to hit the market in 2 months or less.

intrasight
0 replies
3h0m

The Most Powerful Neural Engine Ever

that would be my brain still - at least for now ;)

ttul
9 replies
5h49m

An NVIDIA RTX 4090 generates 73 TFLOPS. This iPad gives you nearly half that. The memory bandwidth of 120 GBps is roughly 1/10th of the NVIDIA hardware, but who’s counting!

lemcoe9
4 replies
5h39m

The 4090 costs ~$1800 and doesn't have dual OLED screens, doesn't have a battery, doesn't weigh less than a pound, and doesn't actually do anything unless it is plugged into a larger motherboard, either.

talldayo
2 replies
4h53m

From Geekbench: https://browser.geekbench.com/opencl-benchmarks

Apple M3: 29685

RTX 4090: 320220

When you line it up like that it's kinda surprising the 4090 is just $1800. They could sell it for $5,000 a pop and it would still be better value than the highest end Apple Silicon.

nicce
0 replies
4h28m

A bit off-topic since not applicable for iPad:

Adding also M3 MAX: 86072

I wonder what the results would be if the test were done on Asahi Linux some day. Apple's implementation is fairly unoptimized AFAIK.

haswell
0 replies
59m

Comparing these directly like this is problematic.

The 4090 is highly specialized and not usable for general purpose computing.

Whether or not it's a better value than Apple Silicon will highly depend on what you intend to do with it. Especially if your goal is to have a device you can put in your backpack.

janalsncm
0 replies
4h49m

And yet it’s worth it for deep learning. I’d like to see a benchmark training Resnet on an iPad.

kkielhofner
1 replies
5h40m

TOPS != TFLOPS

RTX 4090 Tensor 1,321 TOPS according to spec sheet so roughly 35x.

RTX 4090 is 191 Tensor TFLOPS vs M2 5.6 TFLOPS (M3 is tough to find spec).

RTX 4090 is also 1.5 years old.
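
For what it's worth, here are the ratios implied by the numbers quoted in this subthread, taken at face value. TOPS/TFLOPS figures from different vendors often assume different precisions and sparsity, so treat this as a sanity check, not an apples-to-apples benchmark.

    # Ratio check on the figures quoted in this subthread, taken at face value.
    m4_npu_tops = 38          # Apple's claimed Neural Engine throughput
    rtx4090_tops = 1321       # Nvidia spec-sheet Tensor TOPS
    m2_tflops = 5.6           # M2 figure quoted above
    rtx4090_tflops = 191      # Tensor TFLOPS quoted above

    print(f"4090 TOPS / M4 TOPS:     {rtx4090_tops / m4_npu_tops:.1f}x")   # ~34.8x
    print(f"4090 TFLOPS / M2 TFLOPS: {rtx4090_tflops / m2_tflops:.1f}x")   # ~34.1x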

imtringued
0 replies
5h26m

Yeah where are the bfloat16 numbers for the neural engine? For AMD you can at least divide by four to get the real number. 16 TOPS -> 4 tflops within a mobile power envelope is pretty good for assisting CPU only inference on device. Not so good if you want to run an inference server but that wasn't the goal in the first place.

What irritates me the most though is people comparing a mobile accelerator with an extreme high end desktop GPU. Some models only run on a dual GPU stack of those. Smaller GPUs are not worth the money. NPUs are primarily eating the lunch of low end GPUs.

jocaal
0 replies
5h3m

The memory bandwidth of 120 GBps is roughly 1/10th of the NVIDIA hardware, but who’s counting

Memory bandwidth is literally the main bottleneck when it comes to the types of applications GPUs are used for, so everyone is counting.

brigade
0 replies
5h6m

It would also blow through the iPad’s battery in 4 minutes flat

paulpan
7 replies
5h40m

The fact that TSMC publishes their own metrics and target goals for each node makes it straightforward to compare the transistor density, power efficiency, etc.

The most interesting aspect of the M4 is simply that it's debuting on the iPad lineup, whereas historically new chips have always debuted on the iPhone (for the A-series) and MacBook (for the M-series). Makes sense given the low expected yields on the newest node for one of Apple's lower-volume products.

For the curious, the original TSMC N3 node had a lot of issues plus was very costly so makes sense to move away from it: https://www.semianalysis.com/p/tsmcs-3nm-conundrum-does-it-e...

spenczar5
6 replies
4h56m

iPads are actually much higher volume than Macs. Apple sells about 2x to 3x as many tablets as laptops.

Of course, phones dwarf both.

andy_xor_andrew
4 replies
4h23m

The iPad Pros, though?

I'm very curious how much iPad Pros sell. Out of all the products in Apple's lineup, the iPad Pro confuses me the most. You can tell what a PM inside Apple thinks the iPad Pro is for, based on the presentation: super powerful M4 chip! Use Final Cut Pro, or Garageband, or other desktop apps on the go! Etc etc.

But in reality, who actually buys them, instead of an iPad Air? Maybe some people with too much money who want the latest gadgets? Ever since they debuted, the general consensus from tech reviewers on the iPad Pro has been "It's an amazing device, but no reason to buy it if you can buy a MacBook or an iPad Air"

Apple really wants this "Pro" concept to exist for iPad Pro, like someone who uses it as their daily work surface. And maybe some people exist like that (artists? architects?) but most of the time when I see an iPad in a "pro" environment (like a pilot using it for nav, or a nurse using it for notes) they're using an old 2018 "regular" iPad.

transpute
1 replies
4h15m

iPadOS 16.3.1 can run virtual machines on M1/M2 silicon, https://old.reddit.com/r/jailbreak/comments/18m0o1h/tutorial...

Hypervisor support was removed from the iOS 16.4 kernel, hopefully it will return in iPadOS 18 for at least some approved devices.

If not, Microsoft/HP/Dell/Lenovo Arm laptops with M3-competitive performance are launching soon, with mainline Linux support.

dmitrygr
0 replies
3h32m

Microsoft/HP/Dell/Lenovo Arm laptops with M3-competitive performance are launching soon, with mainline Linux support.

I have been seeking someone who’ll be willing to put money on such a claim. I’ll bet the other way. Perchance you’re the person I seek, if you truly believe this?

kmeisthax
0 replies
1h48m

artists? architects?

Ding ding ding ding ding! The iPad Pro is useful primarily for those people. Or at least it was. The original selling point of the Pro was that it had[0] the Apple Pencil and a larger screen to draw on. The 2021 upgrade gave the option to buy a tablet with 16GB of RAM, which you need for Procreate as that has very strict layer limits. If you look at the cost of dedicated drawing tablets with screens in them, dropping a grand on an iPad Pro and Pencil is surprisingly competitive.

As for every other use case... the fact that all these apps have iPad versions now is great, for people with cheaper tablets. The iPad Air comes in 13" now and that'll satisfy all but the most demanding Procreate users anyway, for about the same cost as the Pro had back in 2016 or so. So I dunno. Maybe someone at Apple's iPad division just figured they need a halo product? Or maybe they want to compete with the Microsoft Surface without having to offer the flexibility (and corresponding jank) of a real computer? I dunno.

[0] sold separately, which is one of my biggest pet peeves with tablets

intrasight
0 replies
3h2m

Totally agree about "Pro". Imagine if they gave it a real OS. Someone yesterday suggested dual-booting. At first I dismissed that idea. But after thinking about it, I can see the benefits. They could leave iPadOS alone and create a bespoke OS. They certainly have the resources to do so. It would open up so many new sales channels for a true tablet.

wpm
0 replies
2h51m

iPads as a product line sure, but the M4 is only in the Pros at the moment which are likely lower volume than the MacBook Air.

bearjaws
4 replies
5h46m

We will have M4 laptops running 400B parameter models next year. Wild times.

visarga
3 replies
4h25m

And they will fit in the 8GB RAM with 0.02 bit quant

gpm
2 replies
4h9m

You can get a macbook pro with 128 GB of memory (for nearly $5000).

Which still implies... a 2 bit quant?
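
The arithmetic behind that guess, as a rough sketch (weights only; KV cache and runtime overhead push the real requirement higher):

    # Memory needed just for the weights of a 400B-parameter model
    # at different quantization widths. Ignores KV cache and overhead.
    params = 400e9

    for bits in (16, 8, 4, 2):
        gib = params * bits / 8 / 2**30
        print(f"{bits:>2}-bit: ~{gib:,.0f} GiB")

    # 16-bit: ~745 GiB, 8-bit: ~373 GiB, 4-bit: ~186 GiB, 2-bit: ~93 GiB,
    # so roughly 2 bits per weight is what squeezes under 128 GB.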

freeqaz
1 replies
3h29m

There are some crazy 1/1.5 bit quants now. If you're curious I'll try to dig up the papers I was reading.

1.5bit can be done to existing models. The 1 bit (and less than 1 bit iirc) requires training a model from scratch.

Still, the idea that we can have giant models running in tiny amounts of RAM is not completely far fetched at this point.

gpm
0 replies
3h20m

Yeah, I'm broadly aware and have seen a few of the papers, though I definitely don't try and track the state of the art here closely.

My impression and experience trying low bit quants (which could easily be outdated by now) is that you are/were better off with a smaller model and a less aggressive quantization (provided you have access to said smaller model with otherwise equally good training). If that's changed I'd be interested to hear about it, but definitely don't want to make work for you digging up papers.

dsign
34 replies
6h44m

M4 makes the new iPad Pro an outrageously powerful device for artificial intelligence.

Isn’t there a ToS prohibition about “custom coding” in iOS? Like, the only way you can ever use that hardware directly is for developers who go through Apple Developer Program, which last time I heard was bitter lemon? Tell me if I’m wrong.

freedomben
27 replies
6h38m

Well, this is the heart of the "appliance" model. iPads are appliances. You wouldn't ask about running custom code on your toaster or your blender, so you shouldn't ask about that for your iPad. Also all the common reasons apply: Security and Privacy, Quality Control, Platform Stability and Compatibility, and Integrated User Experience. All of these things are harmed when you are allowed to run custom coding.

(disclaimer: My personal opinion is that the "appliance" model is absurd, but I've tried to steel-man the case for it)

kibwen
18 replies
6h34m

> You wouldn't ask about running custom code on your toaster or your blender, so you shouldn't ask about that for your iPad.

Of course I would, and the only reason other people wouldn't is because they're conditioned to believe in their own innate powerlessness.

If you sell me a CPU, I want the power to program it, period.

freedomben
4 replies
6h27m

I mean this sincerely, are you really an Apple customer then? I feel exactly the same as you, and for that reason I don't buy Apple products. They are honest about what they sell, which I appreciate.

judge2020
3 replies
6h25m

Some arguments are that you shouldn’t be able to create appliances, only general purpose machines.

taylodl
0 replies
6h5m

Ever notice people don't build their own cars anymore? They used to, even up through the '60s. I mean ordering a kit or otherwise purchasing all the components and building the car. Nowadays it's very rare that people do that.

I'm old enough to remember when people literally built their own computers, soldering iron in hand. People haven't done that since the early 80's.

Steve Jobs' vision of the Mac, released in 1984, was for it to be a computing appliance - "the computer for the rest of us." The technology of the day prevented that. Though they pushed that as hard as they could.

Today's iPad? It's the fulfillment of Steve Jobs' original vision of the Mac: a computing appliance. It took 40 years, but we're here.

If you don't want a computing appliance then don't buy an iPad. I'd go further and argue don't buy any tablet device. Those that don't want computing appliances don't have to buy them. It's not like laptops, or even desktops, are going anywhere anytime soon.

nordsieck
0 replies
5h48m

Some arguments are that you shouldn’t be able to create appliances, only general purpose machines.

I sincerely hope that you live as much of your life in that world as possible.

Meanwhile, I'll enjoy having a car I don't have to mess with every time I start it up.

bluescrn
0 replies
5h8m

In a world concerned with climate change, we should see many of these 'appliances' as inherently wasteful.

On top of the ugly reality that they're designed to become e-waste as soon as the battery degrades.

umanwizard
3 replies
5h33m

That may be your personal preference, but you should accept that 99% of people don't care about programming their toaster, so you're very unlikely to ever make progress in this fight.

mort96
1 replies
5h17m

99% of people don't care about programming anything, that doesn't make this gatekeeping right.

kjkjadksj
0 replies
4h6m

You aren’t wrong but businesses aren’t in the market to optimize for 1% their customers

doctor_eval
0 replies
33m

Yeah, if I have to program my toaster, I’m buying a new toaster.

I write enough code during the day to make me happy. I really don’t want to be thinking about the optimal brownness of my bagel.

taylodl
2 replies
6h20m

> If you sell me a CPU, I want the power to program it, period.

Uhhh, there are CPUs in your frickin' wires now, dude! There are several CPUs in your car which you generally don't have access to. Ditto for your fridge. Your microwave. Your oven. Even your toaster.

We're literally awash in CPUs. You need to update your thinking.

Now, if you said something like "if you sell me a general-purpose computing device, then I want the power to program it, period" then I would fully agree with you. BTW, you can develop software for your own personal use on the iPad. It's not cheap or easy (doesn't utilize commonly-used developer tooling), but it can be done without having to jump through any special hoops.

Armed with that, we can amend your statement to "if you sell me a general-purpose computing device, then I want the power to program it using readily-available, and commonly-utilized programming tools."

I think that statement better captures what I presume to be your intent.

talldayo
0 replies
5h32m

but it can be done without having to jump through any special hoops.

You are really stretching the definition of "special hoops" here. On Android, sideloading is a switch hidden in your settings menu; on iOS it's either a jurisdiction-dependent feature or a paid benefit of their developer program.

Relative to every single other commercial, general-purpose operating system I've used, I would say yeah, Apple practically defines what "special hoops" look like online.

duped
0 replies
3h42m

I do actually want the ability to program the CPUs in my car the same way I'm able to buy parts and mods for every mechanical bit in there down to the engine. In fact we have laws about that sort of thing that don't apply to the software.

owenpalmer
2 replies
4h46m

the desire to program one's toaster is the most HN thing I've seen all day XD

BrianHenryIE
1 replies
4h30m

I really wish I could program my dishwasher because it's not cleaning very well and if I could add an extra rinse cycle I think it would be fine.

kjkjadksj
0 replies
4h5m

Start by cleaning the filters

theshrike79
1 replies
6h23m

Can you do that to your car infotainment system btw?

blacklion
0 replies
5h52m

Why not?

It MUST (RFC2119) be airgapped from ABS and ECU, of course.

kjkjadksj
0 replies
4h5m

And engineer your own bagel setting without buying a bagel model? Dream on.

paxys
2 replies
6h7m

An appliance manufacturer isn't doing an entire press event highlighting how fast the CPU on the appliance is.

worthless-trash
0 replies
5h22m

If it's advertised like a general-purpose computer, expectations should be met.

freedomben
0 replies
3h6m

Agree completely. I think it's absurd that they talk about technical things like CPU and memory in these announcements. It seems to me like an admission that it's not really an "appliance" but trying to translate Apple marketing into logical/coherent concepts can be a frustrating experience. I just don't try anymore.

jebarker
1 replies
6h35m

Lots of people ask about running custom code on other appliances. I think they call them hackers.

freedomben
0 replies
6h8m

I think you're reinforcing Apple's point about how security is harmed by allowing custom code.

ben-schaaf
1 replies
5h20m

I appreciate the steel-man. A strong counter argument for me is that you actually can run any custom code on an iPad, as long as it's in a web-browser. This is very unlike an appliance where doing so is not possible. Clearly the intention is for arbitrary custom code to run on it, which makes it a personal computer and not an appliance (and should be regulated as such).

freedomben
0 replies
3h5m

That's a fair point, although (steel-manning) the "custom code" in the browser is severely restricted/sandboxed, unlike "native" code would be. So from that perspective, you could maybe expand it to be like a toaster that has thousands of buttons that can make for hyper-specific stuff, but can't go outside of the limits the manufacturer built in.

shepherdjerred
0 replies
6h22m

If I could deploy to my blender as easily as I can to AWS, then I would _definitely_ at least try it.

eqvinox
1 replies
6h35m

As with any Apple device — or honestly, any computing device in general — my criteria of evaluation would be the resulting performance if I install Linux on it. (If Linux is not installable on the device, the performance is zero. If Linux driver support is limited, causing performance issues, that is also part of the equation.)

NB: those are my criteria of evaluation. Very personally. I'm a software engineer, with a focus on systems/embedded. Your criteria are yours.

(But maybe don't complain if you buy this for its "AI" capabilities only to find out that Apple doesn't let you do anything "unapproved" with it. You had sufficient chance to see the warning signs.)

wishfish
0 replies
6h24m

There's the potential option of Swift Playgrounds which would let you write / run code directly on the iPad without any involvement in the developer program.

sergiotapia
0 replies
6h26m

You're not wrong. It's why I don't use apple hardware anymore for work or play. On Android and Windows I can build and install whatever I like, without having to go through mother-Apple for permission.

philipwhiuk
0 replies
6h23m

C'mon man, it's 2024, they can't just not mention AI in a press release.

killerstorm
0 replies
6h33m

It means you can deliver AI apps to users. E.g. generate images.

giancarlostoro
24 replies
6h39m

Call me crazy, but I want all that power in a 7" tablet. I like 7" tablets most because they feel less clunky to carry around and take with you. Same with 13" laptops, I'm willing to sacrifice on screen real estate for saving myself from the back pain of carrying a 15" or larger laptop.

Some of this is insanely impressive. I wonder how big the OS ROM (or whatever) is with all these models. For context, even if the entire OS is about 15GB, getting some of these features locally just for an LLM on its own takes about 60GB or more for something ChatGPT-esque, which requires me to spend thousands on a GPU.

Apologies for the many thoughts, I'm quite excited by all these advancements. I always say I want AI to work offline and people tell me I'm moving the goalpost, but it is truly the only way it will become mainstream.

ChrisMarshallNY
9 replies
6h28m

I've been using the iPad Mini for years.

I'd love to see them add something to that form factor.

I do see a lot of iPad Minis out there, but usually, as part of dedicated systems (like PoS, and restaurant systems).

On the other hand, I have heard rumblings that Apple may release an even bigger phone, which I think might be overkill (but what do I know. I see a lot of those monster Samsung beasts, out there).

Not sure that is for me. I still use an iPhone 13 Mini.

I suspect that my next Mac will be a Studio. I guess it will be an M4 Studio.

wiredfool
5 replies
6h18m

I loved my ipad mini. It's super long in the tooth now, and I was hoping to replace it today. oh well...

giancarlostoro
2 replies
5h48m

I wish they would stop doing this weird release cycle where some of their tablets don't get the updated chips. It's really frustrating. Makes me hesitant to buy a tablet if I feel like it could get an upgrade a week later or whatever.

wiredfool
0 replies
2h15m

I probably would have pulled the trigger on a price drop, but at 600+ EUR for an old version, I'm just not that into it, since I really expect it to last many years.

JohnBooty
0 replies
5h16m

It certainly seems less than ideal for pro/prosumer buyers who care about the chips inside.

I would guess that Apple doesn't love it either; one suspects that the weird release cycle is at least partially related to availability of chips and other components.

Menu_Overview
1 replies
5h1m

I was ready to buy one today, too. Disappointing.

I miss my old iPad mini 4. I guess I could try the 11" iPad, but I think I'd prefer it to be smaller.

wiredfool
0 replies
2h14m

Yeah, we've got a full-sized iPad here and it's really strange to hold and use. It's all about what you're used to.

giancarlostoro
2 replies
6h17m

I wanted to buy a Mini, but they had not updated the processors for them when I was buying, and they cost way more than a regular iPad at the time; I wanted to be budget conscious. I still sometimes regret not just going for the Mini, but I know I'll get one eventually.

You know what's even funnier: when the Mini came out originally, I made fun of it. I thought it was a dumb concept. Oh, my ignorance.

ectospheno
0 replies
5h13m

I have access to multiple iPad sizes and I personally only use the mini. It's almost perfect. In the last year of its long life cycle you start to feel the age of the processor, but it's still better than holding the larger devices. Can't wait for it to be updated again.

ChrisMarshallNY
0 replies
6h14m

I have an iPad Pro 13", and never use it (It's a test machine).

I use the Mini daily.

It's a good thing they made the Pro lighter and thinner. May actually make it more useful.

onlyrealcuzzo
3 replies
6h37m

Call me crazy, but I want all that power in a 7" tablet

Aren't phones getting close to 7" now? The iPhone pro is 6.2", right?

jwells89
0 replies
6h10m

Biggest difference is aspect ratio. Phones are taller and less pleasant to use in landscape, tablets are more square and better to use in landscape.

You could technically make a more square phone but it wouldn’t be fun to hold in common positions, like up to your ear for a call.

jsheard
0 replies
6h24m

Yeah, big phones have become the new small tablet.

https://phonesized.com/compare/#2299,156

Take away the bezels on the tablet and there's not a lot of difference.

giancarlostoro
0 replies
6h18m

I'm not a huge fan of it, but yeah they are. I actually prefer my phones to be somewhat smaller.

sulam
1 replies
6h18m

If your back is hurting from the ~1lb extra going from 13" to 15", I would recommend some body weight exercises. Your back will thank you, and you'll find getting older to be much less painful.

Regarding a small iPad, isn't that the iPad mini? 8" vs 7" is pretty close to what you're asking for.

teaearlgraycold
0 replies
5h23m

I highly recommend doing pull-ups for your posture and health. It was shocking to me how much the state of my spine improved after doing pull-ups as a daily exercise.

alexpc201
1 replies
6h11m

You can't have all that power in a 7" tablet because the battery will last half an hour.

JohnBooty
0 replies
5h11m

Well, maybe. The screen (and specifically the backlight) is a big drain. Smaller screen = less drain.

r0fl
0 replies
6h27m

The next iPhone Pro Max will be 6.9 inches.

That covers everything you're asking for.

notatoad
0 replies
4h38m

A 7" tablet was a really cool form factor back in the day when phones were 4".

But when 6.7" screens on phones are common, what really is the point of a 7" tablet?

bschmidt1
0 replies
5h15m

I always say I want AI to work offline

I'm with you, I'm most excited about this too.

Currently building an AI creative studio (make stories, art, music, videos, etc.) that runs locally/offline (https://github.com/bennyschmidt/ragdoll-studio). There is a lot of focus on cloud with LLMs but I can't see how the cost will make much sense for involved creative apps like video creation, etc. Present day users might not have high-end machines, but I think they all will pretty soon - this will make them buy them the way MMORPGs made everyone buy more RAM. Especially the artists and creators. Remember, Photoshop was once pretty difficult to run, you needed a great machine.

I can imagine offline music/movies apps, offline search engines, back office software, etc.

ant6n
0 replies
5h7m

I’ve got an iPad mini. The main issue is the screen scratches. The other main issue is the screen is like a mirror, so it can’t be used everywhere to watch videos (which is the main thing the iPad is useful for). The third main issue is that videos nowadays are way too dark and you can’t adjust brightness/gamma on the iPad to compensate.

(Notice a theme?)

Foobar8568
0 replies
5h59m

I am not a large person by any means, yet I have no problem carrying a 16" MBP... but then I have a backpack and not a messenger-style bag, which, I would agree, would be a pain to carry.

Aurornis
0 replies
4h34m

Call me crazy, but I want all that power in a 7" tablet. I like 7" tablets most because they feel less clunky to carry around and take with you.

iPhone Pro Max screen size is 6.7", and the upcoming iPhone 16 Pro Max is rumored to be 6.9" with 12GB of RAM. That's your 7" tablet right there.

The thing is - You're an extreme edge case of an edge case. Furthermore, I'm guessing if Apple did roll out a 7" tablet, you'd find some other thing where it isn't exactly 100% perfectly meeting your desired specifications. For example, Apple is about to release a high powered 6.9" tablet-like device (the iPhone 16 Pro Max) but I'm guessing there's another reason why it doesn't fit your needs.

Which is why companies like Apple ignore these niche use cases and focus on mainstream demands. The niche demands always gain a lot of internet chatter, but when the products come out they sell very poorly.

slashdev
23 replies
6h34m

I've got a Mac Pro paperweight because the motherboard went. It's going to the landfill. I can't even sell it for parts because I can't erase the SSD. If they didn't solder everything to the board you could actually repair it. When I replace my current Dell laptop, it will be with a repairable framework laptop.

stuff4ben
11 replies
6h28m

Just because you lack the skills to fix it doesn't mean it's not repairable. People desolder components all the time to fix phones, iPads, and laptops.

https://www.youtube.com/watch?v=VNKNjy3CoZ4

AzzyHN
7 replies
6h9m

With any other computer I could simply swap the motherboard for one of several compatible boards, no soldering or donor board needed.

kaba0
6 replies
6h5m

It's almost like "any other computer" is not as thin as a finger, nor packed to the brim with features that require miniaturization.

Can you just fix an F1 engine with a wrench?

mort96
1 replies
5h13m

That stuff makes it more difficult to work on, but it doesn't make it impossible for Apple to sell replacement motherboards... nor does making a "thin desktop" require soldering on SSDs, M.2 SSDs are plenty thin for any small form factor desktop use case.

slashdev
0 replies
3h23m

They do it deliberately. They want you to throw it out and buy a new one

sniggers
0 replies
3h31m

The mental gymnastics Apple fanboys will do to defend being sold garbage are amazing.

slashdev
0 replies
3h42m

My Dell laptop is much more repairable. I changed the RAM and added a second SSD myself.

Rebelgecko
0 replies
5h56m

I'm not sure which gen Mac Pro they have, but the current ones aren't that much thinner than the OG cheese grater Macs from 15 years ago.

In fact the current Gen is bigger than the trashcan ones by quite a bit (although IIRC the trash can Macs had user replaceable SSDs and GPUs)

nicce
1 replies
6h19m

In this case, you need to find a working motherboard without soldered parts to be able to fix it cost-efficiently. Otherwise you need to buy a factory component (at extra cost, with soldered components...).

slashdev
0 replies
3h21m

Yeah, it’s not worth it

slashdev
0 replies
3h24m

There’s always some wiseass saying “skill issue”

ipqk
4 replies
5h8m

hopefully at least electronics recycling.

slashdev
2 replies
3h26m

Where do you usually take it for that?

If I find a place in walking distance, maybe.

kylehotchkiss
1 replies
3h5m

You could try sticking it in the phone drop-off thingy at Target. That's my go-to for all non-valuable electronics.

slashdev
0 replies
2h50m

I don't have that here, but maybe there's something similar

slashdev
0 replies
1h49m

Nothing close enough, I checked

kylehotchkiss
2 replies
3h5m

Why don't you take it to the Apple Store to recycle it instead of dropping it in the trash can?

slashdev
1 replies
1h50m

They don’t accept computers for recycling. That’s what I found when I looked it up

kylehotchkiss
0 replies
1h10m

They accept Apple-branded computers for recycling if they have no trade-in value (they'll try to get you an offer if there's any value). I have recycled damaged Apple computers at the store before without trading in.

acdha
1 replies
1h57m

I can't even sell it for parts because I can't erase the SSD

The SSD is encrypted with a rate-limited key in the Secure Enclave - unless someone has your password they’re not getting your data.

slashdev
0 replies
1h51m

Not worth the liability. I’d rather the landfill and peace of mind than the money

kjkjadksj
0 replies
4h1m

Even "repairable" only buys you a few years in which repair actually makes sense. Something similar happened to me: I lost the Mac motherboard on a pre-solder-addiction model. The only thing is, guess how much a used motherboard costs for an old Mac: nearly as much as the entire old Mac in working shape. Between the prices of OEM parts and the depreciation of computers, it makes no sense to repair one once it hits a certain age.

ramboldio
22 replies
6h49m

If only macOS would run on the iPad...

vbezhenar
7 replies
6h39m

Please no. I don't want touch support in macOS.

jwells89
3 replies
6h30m

It might work if running in Mac mode required a reboot (no on the fly switching between iOS and macOS) and a connected KB+mouse, with the touch part of the screen (aside from Pencil usage) turning inert in Mac mode.

Otherwise yes, desktop operating systems are a terrible experience on touch devices.

vbezhenar
2 replies
6h12m

It might work if running in Mac mode required a reboot (no on the fly switching between iOS and macOS) and a connected KB+mouse, with the touch part of the screen (aside from Pencil usage) turning inert in Mac mode.

Sounds like a strictly worse version of a MacBook. Might be useful for occasional work, but I expect people who would use this mode continuously would just switch to a MacBook.

jwells89
0 replies
6h6m

The biggest market would be for travelers who essentially want a work/leisure toggle.

It's not too uncommon for people to carry both an iPad and a MacBook, for example, but a 12.9" iPad that could reboot into macOS to get some work done and then drop back to iPadOS for watching movies or sketching could replace both without too much sacrifice. There are tradeoffs, but nothing worse than what you see on PC 2-in-1s, plus no questionable hinges to fail.

ginko
0 replies
5h44m

MacBooks, even the Air, are too large and heavy IMO. A 10-11 inch tablet running a real OS would be ideal for travel.

Nevermark
1 replies
6h26m

Because it would be so great you couldn’t help using it? /h

What would be the downside to others using it?

I get frustrated that Mac doesn’t respond to look & pinch!

vbezhenar
0 replies
6h18m

Because I saw how this transformed Windows and GNOME. Applications will be reworked with touch support and become worse for me.

jonhohle
0 replies
6h8m

Why would you need it? Modern iPads have thunderbolt ports (minimally USB-C) and already allow keyboards, network adapters, etc. to be connected. It would be like an iMac without the stand and an option to put it in a keyboard enclosure. Sounds awesome.

tcfunk
4 replies
6h37m

I'd settle for some version of Xcode, or some other way of not requiring a macOS machine to ship iOS apps.

JimDabell
2 replies
6h24m

Swift Playgrounds, which runs on the iPad, can already be used to build an app and deploy it to the App Store without a Mac.

alexpc201
1 replies
6h9m

You can't make a decent iOS app with Swift Playgrounds; it's just a toy for kids to learn to code.

interpol_p
0 replies
5h41m

You're probably correct about it being hard to make a decent iOS app in Swift Playgrounds, but it's definitely not a toy

I use it for work several times per week. I often want to test out some Swift API, or build something in SwiftUI, and for some reason it's way faster to tap it out on my iPad in Swift Playgrounds than to create a new project or playground in Xcode on my Mac — even when I'm sitting directly in front of my Mac

The iPad just doesn't have the clutter of open windows and communication apps that my Mac does, which makes it hard to focus on resolving one particular idea

I have so many playground files on my iPad, a quick glance at my project list: interactive gesture-driven animations, testing out time and date logic, rendering perceptual gradients, checking baseline alignment in SF Symbols, messing with NSFilePresenter, mocking out a UI design, animated text transitions, etc

hot_gril
0 replies
4h0m

It needs a regular web browser too.

umanwizard
2 replies
5h31m

They make a version of the iPad that runs macOS; it is called a MacBook Pro.

zuminator
1 replies
5h13m

MacBook Pros have touchscreens and Apple Pencil compatibility now?

umanwizard
0 replies
3h12m

Fair enough, I was being a bit flippant. It’d be nice if that existed, but I suspect Apple doesn’t want it to for market segmentation reasons.

ranyefet
1 replies
5h24m

Just give us support for virtualization and we could install Linux on it and use it for development.

LeoPanthera
0 replies
3h6m

UTM can be built for iOS.

user90131313
0 replies
6h47m

All that power, and the iPad stays as limited as a toy.

swozey
0 replies
6h27m

Yeah, I bought one of them a few years ago planning to use it for a ton of things.

Turns out I only use it on flights to watch movies because I loathe the os.

havaloc
0 replies
6h35m

Maybe this June there'll be an announcement, but like Lucy with the football, I'm not expecting it. I would instabuy if this was the case, especially with a cellular iPad.

Eun
0 replies
6h43m

then a lot of people would buy it, including me :-)

Takennickname
21 replies
6h47m

Why even have an event at this point? There's literally nothing interesting.

throwaway11460
14 replies
6h45m

2x better performance per watt is not interesting? Wow, what a time to be alive.

LoganDark
10 replies
6h42m

To me, cutting wattage in half is not interesting, but doubling performance is interesting. So performance per watt is actually a pretty useless metric since it doesn't differentiate between the two.

Of course efficiency matters for a battery-powered device, but I still tend to lean towards raw power over all else. Others may choose differently, which is why other metrics exist, I guess.

throwaway5959
5 replies
6h35m

It means a lot to me, because cutting power consumption in half for millions of devices means we can turn off power plants (in aggregate). It’s the same as lightbulbs; I’ll never understand why people bragged about how much power they were wasting with incandescents.

yyyk
2 replies
6h19m

cutting power consumption in half for millions of devices means we can turn off power plants

It is well known that software inefficiency doubles every couple of years; that is, the same scenario comes to take 2x as much compute across the entire software stack (not the disembodied algorithm, which will indeed be faster).

The extra compute will be spent on a more abstract UI stack or on new features, unless physical constraints force otherwise (e.g. the inefficient batteries of early smartphones), which is not the case at present.

throwaway11460
1 replies
6h13m

That's weird - if software gets 2x worse every time hardware gets 2x better, why did my laptop in 2010 last 2 hours on battery while the current one lasts 16 doing much more complex tasks for me?

yyyk
0 replies
6h3m

Elsewhere in the comments, it is noted Apple's own estimates are identical despite allegedly 2x better hardware.

As an aside, 2 hours is very low even for 2010. There's a strong usability advantage in going to 16. But going from 16 to 128 won't add as much. The natural course of things is to converge on a decent-enough number and 'spend' the rest on more complex software, a lighter laptop, etc.

codedokode
0 replies
6h12m

First, only CPU power consumption is reduced, not that of other components; second, I doubt tablets contribute significantly to global power consumption, so I don't think any power plants will be turned off.

Nevermark
0 replies
6h17m

They like bright lights?

I have dimmable LED strips around my rooms, hidden by cove molding, reflecting off the whole ceiling, which becomes a super diffuse, super bright “light”.

I don’t boast about power use, but they are certainly hungry.

For that I get soft, diffuse lighting with a max brightness comparable to outdoor clear-sky daylight. Working from home, this is so nice for my brain and depression.

throwaway11460
3 replies
6h40m

This still means you can pack more performance into the chip though - because you're limited by cooling.

LoganDark
2 replies
6h34m

Huh, never considered cooling. I suppose that contributes to the device's incredible thinness. Generally thin-and-light has always been an incredible turnoff for me, but tech is finally starting to catch up to thicker devices.

hedora
1 replies
5h6m

Thin and light is easier to cool. The entire device is a big heat sink fin. Put another way, as the device gets thinner, the ratio of surface area to volume goes to infinity.

If you want to go thicker, then you have to screw around with heat pipes, fans, etc, etc, to move the heat a few cm to the outside surface of the device.
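
A minimal sketch of that limit, modeling the device as a thin slab with fixed footprint area A and thickness t (edges ignored, so the surface is roughly the top and bottom faces):

    \frac{\text{surface area}}{\text{volume}} \approx \frac{2A}{A t} = \frac{2}{t} \to \infty \quad \text{as } t \to 0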

LoganDark
0 replies
2h16m

That's not why thin-and-light bothers me. Historically, ultrabooks and similarly thin-and-light-focused devices have been utterly insufferable in terms of performance compared to something that's even a single cm thicker. But Apple Silicon seems extremely promising; it seems quite competitive with thicker and heavier devices.

I never understood why everyone [looking at PC laptop manufacturers] took thin-and-light to such an extreme that their machines became basically useless. Now Apple is releasing thin-and-light machines that are incredibly powerful, and that is genuinely innovative. I hadn't seen something like that from them since the launch of the original iPhone; that's how big I think this is.

Takennickname
2 replies
5h55m

That's bullshit. Does that mean they could have doubled battery life if they kept the performance the same? Impossible.

throwaway11460
1 replies
5h53m

Impossible why? That's what happened with M1 too.

But as someone else noted, CPU power draw is not the only factor in device battery life. A major one, but not the whole story.

Takennickname
0 replies
3h36m

Intel to M1 is an entire architectural switch where even old software couldn't be run and had to be emulated.

This is a small generational upgrade that doesn't necessitate an event.

Other companies started having events like this because they were copying Apple's amazing events. Apple's events now are just parodies of what Apple was.

antipaul
5 replies
6h47m

Video showing Apple Pencil Pro features was pretty sick, and I ain't even an artist

gardaani
3 replies
6h34m

I think they are over-engineering it. I have never liked gestures because it's difficult to discover something you can't see. A button would have been better than an invisible squeeze gesture.

swozey
2 replies
6h22m

I used Android phones forever until the iPhone 13 came out and I switched to iOS, because I had to de-Google my life completely after they (for no reason at all, "fraud" that I did not commit) blocked my Google Play account.

The amount of things I have to google to use the phone how I normally used Android is crazy. So many gestures required with NOTHING telling you how to use them.

I recently sat around a table with 5 of my friends trying to figure out how to do that "Tap to share contact info" thing. Nobody at the table, all long-term iOS users, knew how to do it. I thought that if we tapped the phones together it would give me some popup on how to finish the process. We tried all sorts of tapping variations and phone positions until we realized we had to unlock both phones.

And one of the people there with the same phone as me (13 pro) couldn't get it to work at all. It just did nothing.

And the keyboard. My god is the keyboard awful. I have never typoed so much, and I have no idea how to copy a URL out of Safari to send to someone without using the annoying Share button which doesn't even have the app I share to the most without clicking the More.. button to show all my apps. Holding my finger over the URL doesn't give me a copy option or anything, and changing the URL with their highlight/delete system is terrible. I get so frustrated with it and mostly just give up. The cursor NEVER lands where I want it to land and almost always highlights an entire word when I want to make a one letter typo fix. I don't have big fingers at all. Changing a long URL that goes past the length of the Safari address bar is a nightmare.

I'm sure (maybe?) there's some option I need to change, but I don't even feel like looking into it anymore. I've given up on learning about the phone's hidden gestures and just use it at probably 1/10th of its potential.

Carplay, Messages and the easy-to-connect-devices ecosystem is the only thing keeping me on it.

antipaul
0 replies
2h48m

To copy a URL from Safari, press and hold (long-press) on the URL bar.

Press and hold also reveals options elsewhere in iOS.

Apocryphon
0 replies
6h11m

Sounds like the ideal use case for an AI assistant. Siri ought to tell you how to access hidden features on the device. iOS, assist thyself.

Takennickname
0 replies
6h45m

The highlight of the event was a stylus?

asow92
14 replies
4h58m

Why are we running these high-end CPUs on tablets without the ability to run pro apps like Xcode?

Until I can run Xcode on an iPad (not Swift Playgrounds), it's a pass for me. Hear me out: I don't want to bring both an iPad and a MacBook on trips, but I need Xcode. Because of this, I have to pick the MacBook every time. I want an iPad, but the iPad doesn't want me.

MuffinFlavored
2 replies
3h16m

I don't want to bring both an iPad and Macbook on trips, but I need ______

Why not just make the iPad run MacOS and throw iPadOS into the garbage?

asow92
1 replies
2h28m

I like some UX aspects of iPadOS, but need the functionality of macOS for work.

reddalo
0 replies
45m

iPadOS is still mainly a fork of iOS, a glorified mobile interface. They should really switch to a proper macOS system, now that the specs allow for it.

timmg
1 replies
1h38m

Just wait until you buy an Apple Vision Pro...

[It's got the same restrictions as an iPad, but costs more than a MacBookPro.]

gpm
0 replies
31m

This is in fact the thing that stopped me from buying an Apple Vision Pro.

elpakal
1 replies
4h38m

"It's not you, it's me" - Xcode to the iPad

asow92
0 replies
4h34m

In all seriousness, you're right. Sandboxing Xcode but making it fully featured is surely a nightmare engineering problem for Apple. However, I feel like some kind of containerized macOS running in the app sandbox could be possible.

al_borland
1 replies
4h36m

WWDC is a month away. I’m hoping for some iPadOS updates to let people actually take advantage of the power they put in these tablets. Apple has often released new hardware before showing off new OS features to take advantage of it.

I know people have been hoping for that for a long time, so I’m not holding my breath.

asow92
0 replies
2h9m

My guess is that WWDC will be more focused on AI this year, but I will remain hopeful.

Naomarik
1 replies
3h56m

Didn't have to look long to find a comment mirroring how I feel about these devices. To me it feels like they're just adding power to an artificially castrated device I can barely do anything with. See no reason to upgrade from my original iPad Pro that's not really useful for anything. Just an overpowered device running phone software.

asow92
0 replies
3h29m

I feel the same way. I just can't justify upgrading from my 10.5" Pro from years ago. It's got ProMotion and runs most apps fine. Sure, the battery isn't great after all these years, but it's not like it's getting used long enough to notice.

smrtinsert
0 replies
4h27m

Yep, I also have no use for a touchscreen device of that size. Happy to get an M4 MacBook Air or whatever it will be called, but I'm done with pads.

kylehotchkiss
0 replies
3h6m

Visual Studio Code running from remote servers seemed like it was making great progress right until the AI trendiness thing took over... and it hasn't seemed to advance much since. Hopefully the AI thing cools down and the efforts on remote tooling/dev environments continue onwards.

kjkjadksj
0 replies
4h9m

Didn't you want to play a reskinned Bejeweled or Subway Surfers with 8K textures?

Lalabadie
10 replies
6h45m

Key values in the press release:

- Up to 1.5x the CPU speed of iPad Pro's previous M2 chip

- Octane gets up to 4x the speed compared to M2

- At comparable performance, M4 consumes half the power of M2

- A high-performance AI engine that Apple claims is 60x the speed of its first Neural Engine (in the A11 Bionic)

stonemetal12
5 replies
6h2m

Apple claimed the M3 was 1.35x the speed of the M2. So the M3 vs M4 comparison isn't that impressive. Certainly not bad by any means; just pointing out why it is compared to the M2 here.

SkyPuncher
1 replies
4h39m

While I completely agree with your point, each M number is a series of chips. The M2 in the iPad is different from the M2 in the MBP or the MacBook Air.

It’s all just marketing to build hype.

astrange
0 replies
3h4m

No, it's the same as the MB Air.

tiltowait
0 replies
3h51m

It seems pretty reasonable to compare it against the last-model iPad, which it's replacing.

kjkjadksj
0 replies
3h58m

People make this comment after every single M-series release. It's true for Intel too, worse even. Changes between, like, 8th and 9th and 10th gen were basically nil: a small clock bump, even the same iGPU.

frankchn
0 replies
3h53m

The other reason it is compared to the M2 is that there are no iPads with M3s in them, so it makes sense to compare to the processor used in the previous generation product.

codedokode
1 replies
6h22m

Up to 1.5x the CPU speed

Doesn't it mean "1.5x speed in rare specific tasks which were hardware-optimized, and 1x everywhere else"?

rsynnott
0 replies
6h12m

I mean, we'll have to wait for proper benchmarks, but that would make it a regression vs the M3, so, er, unlikely.

mrtesthah
0 replies
6h35m

- Up to 1.5x the CPU speed of iPad Pro's previous M2 chip

What I want to know is whether that ratio holds for single-core performance measurements.

BurningFrog
0 replies
6h22m

At this point I read "up to" as "not"...

snapcaster
9 replies
6h45m

Apple was really losing me with the last generation of Intel MacBooks, but these M-class processors are so good they've got me locked in all over again.

atonse
8 replies
6h16m

The M1 Max that I have is easily the greatest laptop I've ever owned.

It is fast and handles everything I've ever thrown at it (I got 32 GB RAM), it never, ever gets hot, and I've never heard a fan in 2+ years (maybe a very soft one if you put your ear next to it). And the battery life is so incredible that I often use it unplugged.

It's just been a no-compromise machine. I was thinking of upgrading to an M3, but will probably upgrade to an M4 instead at the end of this year when the M4 Maxes come out.

teaearlgraycold
4 replies
5h20m

Now that there's a 13-inch iPad, I am praying they remove the display notch on the MacBooks. It's a little wacky when you've intentionally cut a hole out of your laptop screen just to make it look like your phones did 2 generations ago, and now you sell a tablet with the same screen size without that hole.

tiltowait
3 replies
3h49m

I really hate the notch[0], but I do like that the screen stretches into the top area that would otherwise be empty. It's unsightly, but we did gain from it.

[0] Many people report that they stop noticing the notch pretty quickly, but that's never been the case for me. It's a constant eyesore.

teaearlgraycold
1 replies
2h44m

What I've done is use a wallpaper that is black at the top. On the MBP's mini-LED screen that means the black bezel blends almost perfectly into the now-black menu bar. It's pretty much a perfect solution, but the problem it's solving is ridiculous IMO.

doctor_eval
0 replies
25m

I do the same, I can’t see the notch and got a surprise the other day when my mouse cursor disappeared for a moment.

I don’t get the hate for the notch tho. The way I see it, they pushed the menus out of the screen and up into their own dedicated little area. We get more room for content.

It’s like the touchbar for menus. Oh, ok, now I know why people hate it. /jk

kjkjadksj
0 replies
3h19m

A big issue I'm now noticing with the thin bezels is that you lose what used to be a buffer against fingerprints when opening the lid.

kjkjadksj
1 replies
3h21m

That's surprising that you haven't heard the fans. Must be the use case. There are a few games that will get it quite hot and spool up the fans. I have also noticed it's got somewhat poor sleep management and remains hot while asleep. Sometimes I pick up the computer for the first time that day and it's already very hot from whatever kept it out of sleep with a shut lid all night.

artificialLimbs
0 replies
56m

Not sure what app you’ve installed to make it do that, but I’ve only experienced the opposite. Every Windows 10 laptop I’ve owned (4 of them) would never go to sleep and turn my bag into an oven if I forgot to manually shut down instead of closing the lid. Whereas my M1 MBP has successfully gone to sleep every lid close.

grumpyprole
0 replies
6h2m

Unlike the PC industry, Apple is/was able to move their entire ecosystem to a completely different architecture, essentially one developed exactly for low-power use. Windows on ARM efforts will for the foreseeable future be plagued by application and driver support issues. It's a great shame, as Intel hardware is no longer competitive for mobile devices.

mlhpdx
9 replies
6h47m

I'm far from an expert in Apple silicon, but this strikes me as having some conservative improvements. Any in-depth info out there yet?

gmm1990
2 replies
6h40m

It surprised me they called it an M4 vs. an M3-something. The display engine seems to be the largest change; I don't know what that looked like on previous processors. Completely hypothesizing, but it could be a significant efficiency improvement if it's offloading display work.

duxup
0 replies
6h14m

I'd rather they just keep counting up than some companies where they get into wonky product line naming convention hell.

It's ok if 3 to 4 is or isn't a big jump; that it's the next one is really all I want to know. If I need to peek at the specs, the name really won't tell me anything anyhow and I'll be on a webpage.

aeonik
0 replies
6h22m

3 isn't a power of two, maybe the M8 is next.

Lalabadie
2 replies
6h43m

2x the performance per watt is a great improvement, though.

refulgentis
0 replies
6h40m

Clever wording on their part: 2x performance per watt over the M2. Took me a minute; I had to reason through the fact that this is their 2nd-generation 3nm chip, so it wasn't from a die shrink, and then go spelunking.

jeffbee
0 replies
6h23m

This claim can only be evaluated in the context of a specific operating point. I can 6x the performance per watt of the CPU in this machine I am using by running everything on the efficiency cores and clocking them down to 1100MHz. But performance per watt is not the only metric of interest.

Findecanor
1 replies
5h27m

I'm guessing that the "ML accelerator" in the CPU cores means one of ARM's SME extensions for matrix multiplication. SME in ARM v8.4-A adds dot product instructions. v8.6-A adds more, including BF16 support.

https://community.arm.com/arm-community-blogs/b/architecture...
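
As a rough illustration (a sketch of the semantics only, not ARM's or Apple's actual API), this kind of dot-product instruction boils down to a small multiply-accumulate per output lane, e.g. four int8 products summed into a 32-bit accumulator:

    # Per-lane semantics of a 4-way int8 dot-product-accumulate,
    # the sort of operation such instructions perform in hardware.
    def dot4_accumulate(acc: int, a: list[int], b: list[int]) -> int:
        assert len(a) == 4 and len(b) == 4  # four signed 8-bit values each
        return acc + sum(x * y for x, y in zip(a, b))

    print(dot4_accumulate(10, [1, -2, 3, 4], [5, 6, -7, 8]))  # 10 + (5 - 12 - 21 + 32) = 14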

hmottestad
0 replies
35m

Apple has the NPU (also called the Apple Neural Engine), which is dedicated hardware for running inference. It can't be used for LLMs at the moment, though; maybe the M4 will be different. They also have a vector processor attached to the performance cluster of the CPU; they call its instruction set AMX. I believe that one can be leveraged for faster LLM inferencing.

https://github.com/corsix/amx

shepherdjerred
0 replies
6h14m

I expect the pro/max variants will be more interesting. The improvements do look great for consumer devices, though.

alberth
3 replies
6h18m

Especially about Gurman, who he loves to hate on.

atommclain
1 replies
5h36m

Never understood the animosity, especially because it seems to only go one direction.

tambourine_man
0 replies
4h59m

He spills Apple's secrets. Gruber had him on his podcast once and called him a supervillain in the Apple universe, or something like that. It was cringeworthy.

MBCook
0 replies
4h59m

As a longtime reader/listener I don’t see him as hating Gurman at all.

TillE
2 replies
5h34m

or Apple’s silicon game is racing far ahead of what I considered possible

Gruber's strange assumption here is that a new number means some major improvements. Apple has never really been consistent about sticking to patterns in product releases.

tiffanyh
1 replies
5h11m

This is a major improvement (over the M3).

It's on a new fab node size.

It also has more CPU cores than its predecessor (the M3 with 8 cores vs. the M4 with 10).

edward28
0 replies
10m

It's on TSMC N3E, which is slightly less dense but better-yielding than the previous N3B.

bombcar
1 replies
5h37m

Wasn't it relatively well known that the M3 is on an expensive process and quickly getting to an M4 on a cheaper/higher yield process would be worth it?

MBCook
0 replies
5h0m

Yes but Apple has never gone iPad first on a new chip either, so I was with him in that I assumed it wouldn’t be what they would do.

“Let’s make all our Macs look slower for a while!”

So I was surprised as well.

lopkeny12ko
7 replies
6h38m

I always wonder how constraining it is to design these chips subject to thermal and energy limitations. I paid a lot of money for my hardware and I want it to go as fast as possible. I don't need my fans to be quiet, and I don't need my battery life to be 30 minutes longer, if giving those up means I get more raw performance in return. But instead, Apple's engineers have unilaterally decided to handicap their own processors for no real good reason.

boplicity
1 replies
6h32m

Why not go with a Windows based device? There are many loud and low-battery life options that are very fast.

jwells89
0 replies
6h16m

Yeah, one of my biggest frustrations as a person who likes keeping around both recent-ish Mac and Windows/Linux laptops is that x86 laptop manufacturers seem to have a severe allergy to building laptops that are good all-rounders… they always have one or multiple specs that are terrible, usually heat, fan noise, and battery life.

Paradoxically this effect is the worst in ultraportables, where the norm is to cram in CPUs that run too hot for the chassis with tiny batteries, making them weirdly bad at the one thing they’re supposed to be good at. Portability isn’t just physical size and weight, but also runtime and if one needs to bring cables and chargers.

On that note, Apple really needs to resurrect the 12” MacBook with an M-series or even A-series SoC. There’d be absolutely nothing remotely comparable in the x86 ultraportable market.

shepherdjerred
0 replies
6h10m

You're in the minority

jupp0r
0 replies
6h13m

Thermal load has been a major limiting design factor in high-end CPU design for two decades (remember the Pentium 4?).

Apart from that, I think you might be in a minority if you want a loud, hot iPad with a heavy battery to power all of this (for a short time, because physics). There are plenty of Windows devices that work exactly like that, though, if that's really what makes you happy. Just don't expect great performance either, because of the diminishing returns of using higher power and also because the chips in these devices usually suck.

etchalon
0 replies
6h24m

The reason is because battery life is more important to the vast majority of consumers.

Workaccount2
0 replies
6h34m

The overwhelming majority of people who buy these devices will just use them to watch Netflix and TikTok. Apple is well aware of this.

Perceval
0 replies
5h35m

Most of what Apple sells goes into mobile devices: phone, tablet, laptop. In their prior incarnation, they ran up real hard against the thermal limits of what they could put in their laptops with the IBM PowerPC G5 chip.

Pure compute power has never been Apple's center of gravity when selling products. The Mac Pro and the XServe are/were minuscule portions of Apple's sales, and the latter product was killed after a short while.

Apple's engineers have unilaterally decided to handicap their own processors for no real good reason

This is a misunderstanding of what limits Apple products' capability. The mobile devices all have the battery as the limiting factor. The processors being energy-efficient in compute-per-watt isn't a handicap; it's an enabler. And it's a very good reason.

PreInternet01
7 replies
6h45m

M4 makes the new iPad Pro an outrageously powerful device for artificial intelligence

Yeah, well, I'm an enthusiastic M3 user, and I'm sure the new AI capabilities are nice, but hyperbole like this is just asking for snark like "my RTX4090 would like a word".

Other than that: looking forward to when/how this chipset will be available in Macbooks!

mewpmewp2
4 replies
6h36m

Although, as I understand it, M3 chips with more VRAM handle larger LLMs better because they can load more into VRAM compared to a 4090.

freedomben
1 replies
6h31m

This is true, but that is only an advantage when running a model larger than the VRAM. If your models are smaller, you'll get substantially better performance in a 4090. So it all comes down to which models you want to run.

mewpmewp2
0 replies
6h19m

It seems like 13B was running fine on the 4090, but all the more fun or intelligent ones I tried became very slow and would have performed better on an M3.

PreInternet01
1 replies
6h2m

Yes, M3 chips are available with 36GB unified RAM when embedded in a MacBook, although 18GB and below are the norm for most models.

And even though the Apple press release does not even mention memory capacity, I can guarantee you that it will be even less than that on an iPad (simply because RAM is very battery-hungry and most consumers won't care).

So, therefore my remark: it will be interesting to see how this chipset lands in MacBooks.

mewpmewp2
0 replies
1h43m

But the M3 Max should be able to support up to 128GB.

SushiHippie
1 replies
6h33m

Disclosure: I personally don't own any Apple devices, except a work laptop with an M2 chip.

I think a comparison to the 4090 is unfair, as there is no laptop/tablet with an RTX 4090, and the power consumption of a 4090 is ~450W on average.

PreInternet01
0 replies
6h10m

I think a comparison to the 4090 is unfair

No, when using wording like "outrageously powerful", that's exactly the comparison you elicit.

I'd be fine with "best in class" or even "unbeatable performance per Watt", but I can absolutely guarantee you that an iPad does not outperform any current popular-with-the-ML-crowd GPUs...

grzeshru
6 replies
6h27m

Are these M-class chips available to be purchased on Digi-Key and Mouser? Do they have data sheets and recommended circuitry? I’d love to play with one just to see how difficult it is to integrate compared to, say, an stm8/32 or something.

exabrial
1 replies
6h17m

Absolutely not, and even if they were, they are not documented in the least and require an extraordinary custom OS and other BLOBs to run.

grzeshru
0 replies
5h54m

Darn it. Oh well.

downrightmike
1 replies
6h7m

lol

metaltyphoon
0 replies
4h2m

Legit made me chuckle

culopatin
1 replies
4h14m

Did you really expect a yes?

grzeshru
0 replies
3h45m

I didn’t know what to expect. I thought they may license it to other companies under particular clauses or some such.

daft_pink
6 replies
6h19m

Who would buy a MacBook Air, Mac mini, or Studio today with their older chips?

antonkochubey
3 replies
5h57m

Someone who needs a MacBook Air, Mac mini, or Studio, not an iPad.

daft_pink
2 replies
5h50m

I'm just venting that their processor strategy doesn't make much sense. The iPad gets the M4, but the Mini and Studio and Mac Pro are still on M2 and the MacBooks are on M3.

They've essentially undercut every Mac they currently sell by putting the M4 in the iPad and most people will never use that kind of power in an iPad.

If you are going to spend $4k on a Mac don't you expect it to have the latest processor?

victorbjorklund
0 replies
1h55m

People who care about having the latest probably are waiting already anyway.

lotsofpulp
0 replies
1h16m

Probably 80%+ of the population can do everything they need or want to do for the next 5 (maybe even 8) years on an M2 Air available for less than $1,500.

I write this on a $1,000 late 2015 Intel MacBook Air.

rc_mob
0 replies
6h14m

people on a budget

alexpc201
0 replies
6h6m

People with a MacBook. You use the MacBook to work and the iPad to play, read, watch movies, draw, etc. Plus you can use it as a second monitor for the MacBook.

stjo
5 replies
6h49m

Only in the new iPads though; no word on when it'll be available in Macs.

speg
3 replies
6h47m

In the event video, Tim mentions more updates at WWDC next month - I suspect we will see an M4 MacBook Pro then.

atonse
2 replies
6h14m

Haven't they been announcing Pros and Max's around December? I don't remember. If they're debuting them at WWDC I'll definitely upgrade my M1. I don't even feel the need to, but it's been 2.5 years.

hmottestad
0 replies
33m

Mac Studio with M4 Ultra. Then M4 Pro and Max later in the year.

Foobar8568
0 replies
5h56m

November 2023 for the M3 refresh, M2 was January 2023 if I remember well.

throwaway5959
0 replies
6h38m

Hopefully in the Mac Mini at WWDC.

sroussey
5 replies
6h46m

“The next-generation cores feature improved branch prediction, with wider decode and execution engines for the performance cores, and a deeper execution engine for the efficiency cores. And both types of cores also feature enhanced, next-generation ML accelerators.”

gwd
1 replies
6h38m

I wonder to what degree this also implies, "More opportunities for speculative execution attacks".

sroussey
0 replies
6h34m

I thought the same thing when I read it, but considering the attacks were known when doing this improvement, I hope that it was under consideration during the design.

anentropic
1 replies
6h31m

I wish there was more detail about the Neural Engine updates

sroussey
0 replies
1h56m

Agreed. I think they just doubled the number of cores and called it a day, but who knows.

a_vanderbilt
0 replies
6h23m

Fat decode pipelines have historically been a major reason for their performance lead. I'm all for improvements in that area.

marinhero
5 replies
5h2m

I get frustrated seeing this go into the iPad and knowing that we can't get a shell, and run our own binaries there. Not even as a VM like [UserLAnd](https://userland.tech). I could effectively travel with one device less in my backpack but instead I have to carry two M chips, two displays, batteries, and so on...

It's great to see this tech moving forward but it's frustrating to not see it translate into a more significant impact in the ways we work, travel and develop software.

bschmidt1
2 replies
4h36m

Think the play is "consumer AI". Would you really write code on an iPad? And if you do, do you use an external keyboard?

e44858
1 replies
3h57m

Tablets are the perfect form factor for coding because you can easily mount them in an ergonomic position like this: https://mgsloan.com/posts/comfortable-airplane-computing/

Most laptops have terrible keyboards so I'd be using an external one either way.

bschmidt1
0 replies
3h25m

Those keyboards are absolutely ridiculous, sorry.

LeoPanthera
1 replies
3h7m

UTM can be built for iOS.

zamadatix
0 replies
1h55m

Hypervisor.framework is not exposed without a jailbreak which makes this quite limited in terms of usability and functionality.

daniel31x13
5 replies
6h38m

Well, at the end of the day the processor is bottlenecked by its OS. What real value does an iPad bring that a typical iPhone + Mac combo misses? (Other than being a digital notebook…)

cooper_ganglia
1 replies
5h17m

Digital artists can get a lot of use out of it, I'd assume. The Apple Pencil seems pretty nice with the iPad.

daniel31x13
0 replies
3h16m

This. If you’re anything other than a digital artist/someone who genuinely prefers writing over typing, an iPad is just an extra tool for you to waste your money on.

I had one of the earlier versions and this was pretty much its only use case…

tiltowait
0 replies
3h47m

I often prefer (as in enjoy) using my iPad Pro over my 16" M1 MBP, but I think the only thing my iPad is actually better for is drawing.

hot_gril
0 replies
3h53m

My wife has a new iPad for grad school, and I'm convinced it's mainly an extra category for some customers to spend more money on if they already have a Mac and iPhone. The school supplied it, then she spent $400+ on the keyboard and other damn dongles to bring the hardware sorta up to par with a laptop, hoping to replace her 2013 MBP.

In the end, she still has to rely on the MBP daily because there's always something the iPad can't do. Usually something small like a website not fully working on it.

JohnBooty
0 replies
5h1m

I wound up getting a 2019 iPad Pro for 50% off, so $500 or so. Thought I would use it as a work/play hybrid.

Surprisingly (at least to me) I feel that I've more than gotten my money's worth out of it, despite it being almost entirely a consumption device.

I tote it around the house so I can watch or listen to things while I'm doing other things. It's also nice to keep on the dining room table so I can read the news or watch something while we're eating. I could do every single one of these things with my laptop, but... that laptop is my primary work tool. I don't like to carry it all over the place, exposing it to spills and dust, etc.

The only real work-related task is serving as a secondary monitor (via AirPlay) for my laptop when I travel.

$500 isn't pocket change, but I've gotten 48 months of enjoyment and would expect at least another 24 to 36 months. That's about $6 a month, or possibly more like $3-4 per month if I resell it eventually.

Worth it for me.

shepherdjerred
4 replies
6h18m

Am I wrong, or is raytracing on an iPad an _insane_ thing to announce? As far as I know, raytracing is the holy grail of computer graphics.

It's something that became viable on consumer gaming desktops just a few years ago, and now we have real-time ray tracing on a tablet.

vvvvvvvvvvvvv
0 replies
5h40m

iPhones with A17 already have hardware ray tracing. Few applications/games support it at present.

pshc
0 replies
4h14m

It is kind of crazy to look back on. In the future we might look forward to path tracing and more physically accurate renderers. (Or perhaps all the lighting will be hallucinated by AI...?)

luyu_wu
0 replies
5h2m

Why would it be? They announced the same for the A17 in the iPhone. Turns out it was a gimmick that caused over 11W of power draw. Raytracing is a brute-force approach that cannot be optimized to the same level as rasterization. For now at least, it is unsuitable for mobile devices. Now, if we could use the RT units for Blender that'd be great, but it's iPadOS...

bluescrn
0 replies
5h19m

Gotta make the loot boxes look even shinier, keep the gamblers swiping those cards.

troupo
3 replies
6h20m

"The M4 is so fast, it'll probably finish your Final Cut export before you accidentally switch apps and remember that that cancels the export entirely. That's the amazing power performance lead that Apple Silicon provides." #AppleEvent

https://mastodon.social/@tolmasky/112400245162436195

LeoPanthera
1 replies
3h3m

This is a straight-up lie, yes? Switching apps doesn't cancel the export.

troupo
0 replies
43m

Can neither confirm nor deny :) I've seen people complain about this on Twitter and Mastodon though.

It's possible people are running into iOS limitations: it will kill apps when it thinks there's not enough memory.

dlivingston
0 replies
5h43m

Ha. That really highlights how absurd a toy iPadOS is compared to the beasts that are the M-series chips.

It's like putting a Ferrari engine inside of a Little Tikes toy car. I really have no idea who the target market for this device is.

tosh
3 replies
6h48m

Did they mention anything about RAM?

tosh
1 replies
6h47m

faster memory bandwidth

zamadatix
0 replies
6h24m

The announcement video also highlighted "120 GB/s unified memory bandwidth". 8 GB/16 GB depending on model.

dmitshur
0 replies
6h16m

I don't think they included it in the video, but https://www.apple.com/ipad-pro/specs/ says it's 8 GB of RAM in 256/512 GB models, 16 GB RAM in 1/2 TB ones.

pier25
3 replies
6h38m

This is great but why even bother with the M3?

The M3 Macs were released only 7 months ago.

ls612
1 replies
6h30m

Probably they had some contractual commitments with TSMC and had to use up their N3B capacity somehow. But as soon as N3E became available it’s a much better process overall.

a_vanderbilt
0 replies
6h20m

Ramping up production on a new die also takes time. The lower volume and requirements of the M4 as used in the iPad can give them time to mature the line for the Macs.

SkyPuncher
0 replies
4h32m

So far, I haven’t seen any comparison between the iPad M4 and the computer M3. Everything was essentially compared to the last iPad chip, the M2.

Your laptop M3 chip is still probably more powerful than this. The laptop M4 will be faster, but not groundbreakingly faster.

paxys
3 replies
6h45m

They keep making iPads more powerful while keeping them on the Fisher-Price OS and then wonder why no one is buying them for real work.

Who in their right mind will spend $1300-$1600 on this rather than a MacBook Pro?

wizzwizz4
0 replies
6h41m

Everyone knows the Fisher-Price OS is Windows XP (aka Teletubbies OS, on account of Bliss). iPads run Countertop OS (on account of their flat design).

throwaway2562
0 replies
6h44m

This.

hedora
0 replies
6h33m

You can install Linux on the one with a fixed hinge and keyboard, but without a touchscreen. It's the "book" line instead of the "pad" line.

I'm also annoyed that the iPad is locked down, even though it could clearly support everything the MacBook does.

Why can’t we have a keyboard shortcut to switch between the iPad and Mac desktops or something?

999900000999
3 replies
6h30m

I'm extremely happy with my M1 iPad.

The only real issue, aside from the screen eventually wearing out (it already has a bit of flex), is that I can't imagine a reason to upgrade. It's powerful enough to do anything you'd use an iPad for. I primarily make music on mine; I've made full songs with vocals and everything (although without any mastering - I think this is possible in Logic on iPad).

It's really fun for quick jam sessions, but I can't imagine what else I'd do with it. I/O is really bad for media creation: you have a single USB-C port (this bothers me the most; the moment that port dies it becomes e-waste) and no headphone jack...

MuffinFlavored
1 replies
6h24m

Any apps that work with a MIDI controller on the iPad?

Also, can't you just use a USB-C hub for like $10 from Amazon?

999900000999
0 replies
6h14m

I have more USB hubs than I can count.

You still only have one point of failure for the entire device that can't be easily fixed.

And most midi controllers work fine via USB or Bluetooth

tootie
0 replies
5h18m

I have an iPad that predates M1 and it's also fine. It's a media consumption device and that's about it.

winwang
2 replies
2h17m

I would want to know if the LPDDR6 rumors are substantiated, i.e. memory bus details.

https://forums.macrumors.com/threads/lpddr6-new-beginning-of...

If the M4 Max could finally break the 400 GB/s limit of the past few years and hit 600 GB/s, it would be huge for local AI, since it could directly translate into inference speedups.
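
A crude sketch of why bandwidth maps so directly onto decode speed (assuming batch-1 decoding is memory-bound, so each generated token streams roughly the full weight set once; real-world throughput will be lower):

    # Upper-bound estimate: tokens/s ≈ memory bandwidth / bytes of weights read per token.
    def max_tokens_per_s(bandwidth_gb_s: float, params_billions: float, bytes_per_weight: float) -> float:
        model_gb = params_billions * bytes_per_weight  # e.g. 70B at 4-bit ≈ 35 GB
        return bandwidth_gb_s / model_gb

    for bw in (400, 600):
        print(f"{bw} GB/s, 70B @ 4-bit: ~{max_tokens_per_s(bw, 70, 0.5):.0f} tok/s ceiling")
    # 400 GB/s -> ~11 tok/s, 600 GB/s -> ~17 tok/s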

hmottestad
1 replies
42m

The M4 has increased the bandwidth from 100 to 120 GB/s. The M4 Max would probably be 4x that at 480 GB/s, but the M4 Ultra would be 960 GB/s compared to M2 Ultra at 800 GB/s.

winwang
0 replies
8m

Dang. +20% is still a nice difference, but not sure how they did that. Here's hoping M4 Max can include more tech, but that's copium.

960 GB/s is 3090 level, so that's pretty good. I'm curious whether the MacBooks right now are actually more compute-limited due to tensor throughput being relatively weak; not sure about real-world perf.

visarga
2 replies
4h37m

LLaMA 3 tokens/second please, that's what we care about.

mlboss
0 replies
16m

Only spec that I care about

bschmidt1
0 replies
4h36m

Hahah yes

tibbydudeza
2 replies
6h25m

Only 4 P-cores???

antonkochubey
0 replies
5h55m

It's a tablet/ultrabook chip; are you expecting a Threadripper in it?

a_vanderbilt
0 replies
6h18m

I'm hoping the higher efficiency gains and improved thermals offset that. The efficiency cores tend to have more impact on the Macs where multitasking is heavier.

therealmarv
2 replies
4h59m

So why should I buy any Apple laptop with an M3 chip now (if I'm not in a hurry)? lol

_ph_
0 replies
22m

If you are not in a hurry, you almost never should buy new hardware, as the next generation will be around the corner. On the other hand, it could be up to 12 months until the M4 is available across the line. And for most tasks, an M3 is a great value too. One might watch how many AI features that would benefit from an M4 are presented at WWDC. But then, the next macOS release won't be out before October.

MBCook
0 replies
4h58m

That's why a lot of people weren't expecting this and even questioned Mark Gurman's article saying it would happen.

rvalue
2 replies
5h45m

No mention of battery life. They keep making stuff thin and unupgradeable. What's the point of buying an Apple device that is going to wear out in 5 years?

namdnay
0 replies
5h44m

To be fair every MacBook I’ve had has lasted 10 years minimum

TillE
0 replies
5h32m

Battery replacement costs $200. It's not like you just have to throw it in the trash if the battery dies.

qwertyuiop_
2 replies
6h21m

Does anyone know how much of this "giant leap" in performance, as Apple puts it, is really useful and perceptible to end users of the iPad? I am thinking of gaming and art applications on the iPad. What other major iPad use cases are out there that need this kind of performance boost?

musictubes
0 replies
5h31m

Making music. The iPad is much better for performing than a computer. There is a huge range of instruments, effects, sequencers, etc. available on the iPad. Things like physical modeling and chained reverb can eat up processor cycles, so more performance is always welcome.

Both Final Cut Pro and DaVinci Resolve can also use as much power as you can give them, though it isn't clear to me why you'd use an iPad instead of a Mac. They also announced a crazy multicam app for iPads and iPhones that allows remote control of a bunch of iPhones at the same time.

bschmidt1
0 replies
4h38m

I imagine running LLMs and other AI models to produce a variety of art, music, video, etc.

onetimeuse92304
2 replies
6h29m

As an amateur EE it is so annoying that they reuse names of already existing ARM chips.

ARM Cortex-M4, or simply M4, is a quite popular ARM core. I use M0, M3, and M4 chips from ST on a daily basis.

zerohp
0 replies
4h39m

As a professional EE, I know that the ARM Cortex-M4 is not a chip. It's an embedded processor core that is put into an SoC (which is a chip), such as the STM32 family from ST.

jupp0r
0 replies
6h20m

It's not like the practice of giving marketing names to chips is generally a world of logical sanity if you look at Intel i5/i7/i9 etc.

jshaqaw
2 replies
1h2m

What the heck do I do with an M4 in an iPad? Scroll Hacker News really, really fast?

Apple needs to reinvest in software innovation on the iPad. I don't think my use case for it has evolved in 5 years.

hmottestad
1 replies
51m

I was hoping they would come out and say "and now developers can develop apps directly on their iPads with our new release of Xcode" but yeah, no. Don't know if the M4 with just 16GB of memory would be very comfortable for any pro workload.

doctor_eval
0 replies
36m

There’s no way Apple would announce major new OS features outside of WWDC.

So, perhaps it’s no coincidence that a new iPadOS will be announced in exactly one month.

Here’s hoping anyway!

gavin_gee
2 replies
5h46m

I'm still rocking an iPad 6th generation. It's a video consumption device only. A faster CPU doesn't enable any new use cases.

The only reason to upgrade is the consumer's desire to buy more.

nortonham
0 replies
5h2m

And I still have an original first-gen iPad Air... it still works for basic things. Unsupported by Apple now, but still usable.

cyberpunk
0 replies
5h14m

The new OLED does look like quite a nice display though...

treesciencebot
1 replies
6h25m

~38 TOPS at fp16 is amazing, if the quoted number is fp16 (the ANE is fp16 according to this [1], but that honestly seems like a bad choice when people are going smaller and smaller even on the higher-end datacenter cards, so I'm not sure why Apple would use it instead of fp8 natively).

[1]: https://github.com/hollance/neural-engine/blob/master/docs/1...

imtringued
0 replies
5h7m

For reference, the llama.cpp people are not going smaller. Most of those models run on 32-bit floats, with the dequantization happening on the fly.

tiffanyh
1 replies
6h7m

Nano-Texture

I really hope this comes to all Apple products soon (iPhones, all iPads, etc).

It's some of the best anti-reflective tech I've seen that keeps color and brightness deep & bright.

kylehotchkiss
0 replies
3h4m

Will be interesting to see how it holds up on devices that get fingerprints and could be scratched though. Sort of wish Apple would offer it as a replaceable screen film.

pxc
1 replies
6h37m

Given that recent Apple laptops already have solid all-day battery life, with such a big performance per watt improvement, I wonder if they'll end up reducing how much battery any laptops ship with to make them lighter.

asadotzler
0 replies
25m

No, because battery life isn't just about the CPU. The CPU sits idle most of the time and when it's not idle, it's at workloads like 20% or whatever. It's the screens that eat batteries because they're on most or all of the time and sucking juice. Look at Apple's docs and you'll see the battery life is the exact same as the previous model. They have a battery budget and if they save 10% on CPU, they give that 10% to a better screen or something. They can't shrink the battery by half until they make screens twice as efficient, not CPUs which account for only a small fraction of power draw.
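
A rough illustration with invented numbers (the real split varies a lot by workload), showing why halving SoC power doesn't halve battery drain:

    # Toy laptop power budget at light use; all numbers invented.
    display_w, soc_w, other_w = 3.0, 1.5, 1.0     # average watts
    battery_wh = 50.0

    def hours(soc_watts):
        return battery_wh / (display_w + soc_watts + other_w)

    print(hours(soc_w))        # baseline:           ~9.1 h
    print(hours(soc_w / 2))    # SoC power halved:   ~10.5 h, nowhere near 2x

The display and the rest of the system dominate, so CPU efficiency gains move total runtime only a little.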

ionwake
1 replies
6h37m

Sorry to be a noob, but does anyone have a rough estimate of when this M4 chip will be in a MacBook Air or MacBook Pro?

a_vanderbilt
0 replies
6h26m

If I had to venture a guess, maybe WWDC '24 that's coming up.

fallingsquirrel
1 replies
6h43m

Is it just me or is there not a single performance chart here? Their previous CPU announcements have all had perf-per-watt charts, and that's conspicuously missing here. If this is an improvement over previous gens, wouldn't they want to show that off?

a_vanderbilt
0 replies
6h26m

Since the Intel->M1 transition, the performance gains haven't been the headliners they once were, although the uplifts haven't been terrible. Leaving the charts out also lets them hide behind a more impressive-sounding multiplier, which can reference something specific that isn't necessarily applicable to broader tasks.

eterevsky
1 replies
4h51m

They are talking about iPad Pro as the primary example of M4 devices. But iPads don't really seem to be limited by performance. Nobody I know compiles Chrome or does 3D renders on an iPad.

SkyPuncher
0 replies
4h43m

It’s all marketing toward people who aspire to be these creative types. Very few people actually need it, but it feels good when the iPad Air is missing a few key features that push you to the Pro.

More practically, it should help with battery life. My understanding is that energy usage scales non-linearly with demand: a more powerful chip running at 10% may be more battery efficient than a less powerful chip running at 20%.
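
A crude sketch of the usual reasoning (toy numbers, not measurements): dynamic power scales roughly with frequency times voltage squared, and voltage can drop at lower frequencies, so doing the same work at a lower operating point can cost less total energy even though it takes longer.

    # Toy DVFS model: dynamic power ~ C * f * V^2, energy = power * time.
    # Numbers are invented and only meant to show the shape of the effect.
    work = 1e9                                   # fixed amount of work, in cycles
    def energy(f_ghz, volts, c=1.0):
        power = c * f_ghz * volts ** 2           # arbitrary power units
        time = work / (f_ghz * 1e9)              # seconds to finish the work
        return power * time

    print(energy(3.0, 1.00))   # fast, high voltage
    print(energy(1.5, 0.70))   # half the speed, lower voltage: less energy

It can cut the other way too: a chip that finishes quickly and races to idle can also win, so the real answer depends on the workload.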

diogenescynic
1 replies
6h29m

So is the iPad mini abandoned due to the profit margins being too small, or what? I wish they'd just make it clear so I could upgrade without worrying that a mini replacement will come out right after I buy something. And I don't really understand why there are so many different iPads now (Air/Pro/Standard). It just feels like Apple is slowly becoming like Dell: offering a bunch of SKUs and barely differentiated products. I liked when Apple had fewer products that actually had a more distinct purpose.

downrightmike
0 replies
6h2m

They refresh it like every 3 years

Dowwie
1 replies
4h11m

Can anyone explain where the media engine resides and runs?

wmf
0 replies
4h5m

The whole iPad is basically one chip so... the media engine is in the M4. AFAIK it's a top-level core not part of the GPU but Marcan could correct me.

zincmaster
0 replies
3h3m

I own M1, A10X and A12X iPad Pros. I have yet to see any of them ever max out their processor or get slow. I have no idea why anyone would need an M4 one. Sure, it's because Apple no longer has M1s being fabbed at TSMC. But seriously, who would upgrade?

Put macOS on the iPad Pro, then it gets interesting. The most interesting things my iPad Pros do are showing security cameras and reading OBD-II data from my vehicle. Hell, they can't even maintain an SSH connection correctly. Ridiculous.

I see Apple always showing videos of people editing video on their iPad Pro. Who does that??? We use them for watching videos (kids). One is in a car as a mapping system - that's a solid use case. One I gave my Dad and he didn't know what to do with it - so it's collecting dust. And one lives in the kitchen doing recipes.

Functionally, a 4 year old Chromebook is 3x as useful as a new iPad Pro.

zenethian
0 replies
6h37m

This is pretty awesome. I wonder if it has a fix for the GoFetch security flaw?

vivzkestrel
0 replies
4h8m

Any benchmarks of how it stacks up to the M1, M2 and M3?

tsunamifury
0 replies
5h4m

It's really saying something about how the tech sector has shifted due to the recent AI wave that Apple is announcing a chipset entirely apart from a product.

This has never happened in this company's history, to my knowledge. I could be wrong though; even the G3/G4s were launched as Power Macs.

spxneo
0 replies
2h19m

Almost went for an M2 with 128GB to run some local llamas.

Glad I held out. The M4 is going to put downward pressure on prices across all previous gens.

edit: nvm, AMD is coming out with twice the performance of the M4 in two months or less. If the M2s become super cheap I will consider it, but the M4 came far too late. There are just way better alternatives now and very soon.

satertek
0 replies
6h19m

Are there enough cores to allow user switching?

rnikander
0 replies
6h5m

Any hope for a new iPhone SE? My 1st gen's battery is near dead.

radicaldreamer
0 replies
4h11m

Unfortunate that they got rid of the SIM card slot; Google Fi only supports physical SIMs for its "data only" SIM feature.

phkahler
0 replies
2h41m

> And with AI features in iPadOS like Live Captions for real-time audio captions, and Visual Look Up, which identifies objects in video and photos, the new iPad Pro allows users to accomplish amazing AI tasks quickly and on device. iPad Pro with M4 can easily isolate a subject from its background throughout a 4K video in Final Cut Pro with just a tap, and can automatically create musical notation in real time in StaffPad by simply listening to someone play the piano. And inference workloads can be done efficiently and privately...

These are really great uses of AI hardware. All of them benefit the user, where many of the other companies doing AI are somehow trying to benefit themselves. AI as a feature vs AI as a service or hook.

oxqbldpxo
0 replies
6h13m

All this powerful hardware on a laptop computer is like driving a Ferrari at 40 mph. It is begging for better use. If Apple ever releases an AI robot, that's going to change everything. Long way to go, but when it arrives, it will be ChatGPT x100.

obnauticus
0 replies
5h56m

Looks like their NPU (aka ANE) takes up about 1/3 of the die area of the GPU.

Would be interesting to see how much they’re _actually_ utilizing the NPU versus their GPU for AI workloads.

nojvek
0 replies
4h10m

I am awaiting the day when a trillion transistors will be put on a mobile device chewing 5W of peak power.

It's going to be a radical future.

noiv
0 replies
5h51m

I got somewhat accustomed to new outrageous specs every year, but reading near the end that by 2030 Apple plans to be 'carbon neutral across the entire manufacturing supply chain and life cycle of every product' makes me hope one day my devices are not just an SUV on the data highway.

lenerdenator
0 replies
3h54m

So long as it lets me play some of the less-intense 00's-10's era PC games in some sort of virtualization framework at decent framerates one day, and delivers great battery life as a backend web dev workstation-on-the-go the next, it's a good chip. The M2 Pro does.

leesec
0 replies
2h34m

Why are all the comparisons with the M2? Apple did this with the M3 -> M1 as well right?

jgiacjin
0 replies
54m

Is there an SDK for working on games with Unity on the iPad Pro M4?

hmottestad
0 replies
58m

120 GB/s memory bandwidth. The M4 Max will probably top out at 4x that and the M4 Ultra at 2x that again. The M4 Ultra will be very close to 1TB/s of bandwidth. That would put the M4 Ultra in line with the 4090.
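
Spelling out that arithmetic (the 4x and 2x multipliers are the speculation above, and the 4090 number is its published ~1008 GB/s spec):

    base = 120                           # GB/s, M4 in the iPad Pro
    m4_max_guess = base * 4              # 480 GB/s, if the Max again multiplies it by 4
    m4_ultra_guess = m4_max_guess * 2    # 960 GB/s, if the Ultra doubles the Max
    rtx_4090 = 1008                      # GB/s, published spec
    print(m4_ultra_guess, rtx_4090)      # 960 vs 1008: the same ballpark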

Rumours are that the Mac Studio and Mac Pro will skip M3 and go straight to M4 at WWDC this summer, which would be very interesting. There has also been some talk about an M4 Extreme, but we've heard rumours about the M1 Extreme and M2 Extreme without any of those showing up.

gigatexal
0 replies
8m

lol, I just got an M3 Max and this chip does more than 2x the TOPS that my NPU does

gigatexal
0 replies
4m

Anyone know if the RAM multiples of the M4 are better than the M3's? Example: could a base model M4 sport more than 24GB of RAM?

exabrial
0 replies
6h14m

All I want is more memory bandwidth at lower latency. I've learnt that's the vast majority of felt responsiveness today. I couldn't care less about AI and Neural Engine party tricks, stuff I might use once a day or week.

czbond
0 replies
4h59m

Any idea when the M4 will be in a Mac Pro?

bschmidt1
0 replies
6h14m

Bring on AI art, music, & games!

bmurphy1976
0 replies
4h31m

I love these advances and I really want a new iPad but I can't stand the 10"+ form factor. When will the iPad Mini get a substantial update?

biscuit1v9
0 replies
1h5m

"The latest chip delivering phenomenal performance to the all-new iPad Pro." What a joke. They have the M4 and they still run iOS? Why can't they run macOS instead?

To take it a bit further: if an iPad had a keyboard, mouse and macOS, it would basically be a 10- or 12-inch MacBook.

bigdict
0 replies
6h33m

38 TOPS in the Neural Engine comes dangerously close to the Microsoft requirement of 40 TOPS for "AI PCs".

api
0 replies
6h0m

Looks great. Now put it in a real computer. Such a waste to be in a jailed device that can’t run anything.

animatethrow
0 replies
1h57m

Only iPad Pro has M4? Once upon a time during the personal computer revolution in the 1980s, little more than a decade after man walked on the moon, humans had sufficiently technologically developed that it was possible to compile and run programs on the computers we bought, whether the computer was Apple (I, II, III, Mac), PC, Commodore, Amiga, or whatever. But these old ways were lost to the mists of time. Is there any hope this ancient technology will be redeveloped for iPad Pro within the next 100 years? Specifically within Q4 of 2124, when Prime will finally offer deliveries to polar Mars colonies? I want to buy an iPad Pro M117 for my great-great-great-great-granddaughter but only if she can install a C++ 212X compiler on it.

amai
0 replies
4h34m

16GB RAM ought to be enough for anybody! (Tim Cook)

alexpc201
0 replies
6h18m

I understand that they delayed the announcement of these iPads until the M4 was ready; otherwise there would be nothing interesting to offer to those who have an M2 iPad Pro. I don't see the point of having an M3 MacBook and an M4 iPad. If I can't run Xcode on an M4 iPad, the MacBook is the smarter option; it has a bigger screen and more memory, and if you complement it with an iPad Air, you don't miss out on anything.

adonese
0 replies
6h12m

Imagine a device as powerful as this new iPad, yet so useless. It baffles me that we have this great hardware while the software side is lacking.

abhayhegde
0 replies
1h20m

What's the endgame with iPads though? I mainly use mine for consumption, taking notes and jotting annotations on PDFs. It's a significant companion for my work, but I can't see any reason to upgrade from the iPad Air 5, especially given that the 2nd-gen Pencil isn't compatible.

TheRealGL
0 replies
3h57m

Who wrote this? "A fourth of the power"? What happened to "a quarter of the power"?

ProfessorZoom
0 replies
5h30m

Apple Pencil Pro...

Apple Pencil Ultra next?

Apple Pencil Ultra+

Apple Pencil Pro Ultra XDR+

JodieBenitez
0 replies
5h38m

And here I am with my MB Air M1 with no plan to upgrade whatsoever because I don't need to...

(yes, I understand this is about the iPad, but I guess we'll see the M4 in the MB Air as well?)

GalaxyNova
0 replies
1h25m

I hate how Apple tends to make statements about its products without clear benchmarks.

FredPret
0 replies
4h43m

Why does a tablet have a camera bump!? Just take out the camera. And let me run VSCode and a terminal.

EugeneOZ
0 replies
5h42m

The 1TB model = €2750.

For iPad, not an MBP laptop.