
seek3r

macrumors 68020
Aug 16, 2010
2,390
3,458
Now if only we got new Xserves based on this. Hey, Apple, I know a way you can at least slightly defray costs on this chip by adding more markets…
 

asiga

macrumors 65816
Nov 4, 2012
1,051
1,368
Dual use design means that the components Apple designs can be used for multiple applications. It’s really not specific to the Ultra series.

The bigger point of what Apple might start doing with the M5 series is the use of SoIC packaging technology. Currently Apple uses monolithic SoC dies, meaning a variety of system components are crammed into a single square of silicon (CPU, GPU, NPU, memory controller, Secure Enclave, display controllers, USB/Thunderbolt controllers). For smaller chips like the base M this works fine, but as you start to scale it up the SoC dies get to be quite large. The Max is about as large as is practical to make a single die, hence why the Ultra is basically two Max chips stitched together. But this isn’t ideal because you’re wasting silicon space on those extra controllers you might not need (you don’t need two Secure Enclaves for example).

SoIC means Apple can fabricate all of those SoC components separately and merge them together into a smaller and more efficient package (that performs better than a monolithic die). It also means Apple can more easily customize its chips rather than having to tape out an entirely new die. This is where the “dual use design” comes in. Apple can take CPU, GPU, and NPU tile designs from its consumer chips and create custom packages that are ideal for AI servers (for example lots of CPU and NPU cores but skimping on GPU). Because everything is modular there’s more flexibility when it comes to custom use cases.
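The mix-and-match idea described above can be made concrete with a purely illustrative sketch. All of the tile names and core counts below are made up for illustration, not Apple's actual designs; the point is just that reusable tiles can be recombined into different packages without taping out a new die:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Tile:
    kind: str      # "CPU", "GPU", "NPU", "IO", "SecureEnclave"
    units: int     # core count, or 1 for a controller tile

def build_package(tiles):
    """Total up per-kind resources for a package assembled from reusable tiles."""
    totals = {}
    for t in tiles:
        totals[t.kind] = totals.get(t.kind, 0) + t.units
    return totals

# Hypothetical reusable tile designs
cpu_tile = Tile("CPU", 12)
gpu_tile = Tile("GPU", 40)
npu_tile = Tile("NPU", 16)
io_tile  = Tile("IO", 1)
se_tile  = Tile("SecureEnclave", 1)

# Consumer package: balanced CPU/GPU mix
consumer = build_package([cpu_tile, gpu_tile, npu_tile, io_tile, se_tile])

# AI-server package: double the CPU tiles, triple the NPU tiles, skip the
# big GPU — and, unlike two stitched-together Max dies, only one Secure Enclave
server = build_package([cpu_tile, cpu_tile,
                        npu_tile, npu_tile, npu_tile,
                        io_tile, se_tile])
```

Both packages reuse the exact same tile designs; only the bill of materials changes, which is the flexibility a monolithic die can't offer.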
Thank you very much. I feel like I didn't go to class the day "dual-use design" was explained, because it's the first time I see it in the context of SoC design 😄
 

Cervisia

macrumors newbie
Jun 22, 2024
27
82
In what benchmark or scenario does an M1 Pro beat an M3 Pro? And I'm on your side regarding my dislike of how they handicapped the M3 Pro in hopes of selling more Max chips!
I'm gonna need to comb through my YouTube history to find the video; an AI guy ran benchmarks with the M1 Pro, M3 Pro/Max, and an Nvidia card. The scenario is essentially anywhere a GPU would be used, because GPUs' main strength over CPUs is their higher memory speeds and bandwidth. That's why Nvidia does the same with their current-gen gaming GPUs: reducing/holding back memory bus width and VRAM size.
 

Cervisia

macrumors newbie
Jun 22, 2024
27
82
Hi Cervisia - are there stats on this? I would like to understand the memory bandwidth issue better because I need to upgrade my MBP from my current Intel chip to an Apple M-class chip. I don't want an AI chip. But I also don't want a chip that has been intentionally degraded in terms of memory bandwidth just so that Apple can reduce production costs. (I don't love Apple so much these days. Feels like they are cheap, penny-pinching Ebenezer Scrooges. I miss the Apple that made it their business to constantly surprise and delight me.) I want to know what inflection point I should look for to help me know when to upgrade. If the M2 had the best memory bandwidth, what was that throughput? I may wait for M4 Pro / M4 Max / M5 / M5 Pro / MX to match or exceed that M2 Pro data point.
The Max chips (M1 Max, M2 Max, and the 12 P-core M3 Max) have 400 GB/s. The 10 P-core M3 Max is reduced to 300 GB/s. The M2 Pro was 200, the M3 Pro has 150. For the base chips, the M1 had 66.67, the M2 and M3 100, and the M4 has 120 GB/s.

If you look at these numbers, the M4's 120 GB/s is probably what Apple considers the necessary minimum for Apple Intelligence. The M3 Pro barely has more.

What you need depends on what you do. I dug into this because I run astrophysical simulations over very long timescales: I repeat the same simple calculations on a large set of numbers many times, which means most of the time the CPU is just waiting for RAM. For me, RAM speed is king. For you, it could be completely irrelevant. For most people it's not really relevant, which is why they get away with pushing the people who do need it toward more expensive products (or a PC).
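As a rough illustration of the "CPU waiting for RAM" point, here's a minimal Python sketch that times a large buffer copy to estimate effective memory bandwidth. It's a crude stand-in for a proper tool like the STREAM benchmark, and interpreter overhead means it will understate the hardware's real numbers, but it shows why a memory-bound workload cares about GB/s rather than core count:

```python
import time

def measured_copy_bandwidth(n_bytes=256 * 1024 * 1024):
    """Copy a large buffer once and report effective GB/s.

    The copy reads the whole source and writes the whole destination,
    so total traffic is roughly 2 * n_bytes.
    """
    src = bytearray(n_bytes)
    t0 = time.perf_counter()
    dst = bytes(src)                      # one full read + one full write
    dt = time.perf_counter() - t0
    return (2 * n_bytes) / dt / 1e9       # bytes/s -> GB/s

print(f"effective copy bandwidth: {measured_copy_bandwidth():.1f} GB/s")
```

On a bandwidth-starved machine this number (and hence a simulation loop like the one described above) barely improves no matter how fast the cores are.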
 

ric22

Suspended
Mar 8, 2022
2,296
2,179
I'm gonna need to comb through my YouTube history to find the video; an AI guy ran benchmarks with the M1 Pro, M3 Pro/Max, and an Nvidia card. The scenario is essentially anywhere a GPU would be used, because GPUs' main strength over CPUs is their higher memory speeds and bandwidth. That's why Nvidia does the same with their current-gen gaming GPUs: reducing/holding back memory bus width and VRAM size.
Apple touted the high bandwidth as something amazing, a great selling point, then seemed to realise that 99% of people didn't really utilise it. A video I saw ages ago showed someone trying to utilise it all on a Max and a Pro and failing. https://www.iot-now.com/2024/02/07/… The research there seems to suggest that to utilise all of the TOPS Apple provides, you probably don't need more bandwidth than a regular M4 provides (Table 1, if I'm reading it right). If Apple trebled the TOPS numbers, then M2 Pro levels of bandwidth would be handy.
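The TOPS-versus-bandwidth relationship can be sketched with simple roofline-style arithmetic. The arithmetic-intensity figure below (ops per byte moved) is an assumed illustration, not a measured value; the point is only that the bandwidth needed to keep an NPU fed scales linearly with its TOPS rating:

```python
def min_bandwidth_gbs(tops, ops_per_byte):
    """Roofline-style estimate: GB/s of memory bandwidth needed to sustain
    `tops` trillion ops/s at a given arithmetic intensity (ops per byte)."""
    return tops * 1e12 / ops_per_byte / 1e9

# Illustrative numbers only: an NPU rated ~38 TOPS running a workload that
# performs ~320 ops per byte of memory traffic would need roughly:
print(min_bandwidth_gbs(38, 320))   # → 118.75 (GB/s), close to the M4's 120
# Treble the TOPS and the bandwidth requirement trebles too:
print(min_bandwidth_gbs(114, 320))  # → 356.25 (GB/s)
```

This matches the intuition in the post above: today's base-chip bandwidth roughly suffices for today's TOPS, but a big jump in TOPS would push the requirement toward Pro/Max-class bandwidth.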
 

Manzanito

macrumors 65816
Apr 9, 2010
1,126
1,819
But why would Apple want you to buy a Studio if they’re making so much profit off selling you memory and ssd? Are they making even more profit with Studios? I’d be interested in seeing what the numbers are.
I’m sure they’d be delighted if everyone maxed out every computer.

I don’t know what the numbers are, but it stands to reason that with more sensible BTO pricing, a lot fewer people would buy the superior model.
 

IIGS User

macrumors 65816
Feb 24, 2019
1,122
3,144
I’m not completely sold on the whole AI thing. First of all, it’s super power hungry, especially for the half-baked tripe it’s kicking out right now. Has anyone considered the carbon footprint of all these AI-generated responses on Google that I didn’t ask for? The processing is using a LOT of electricity for the sake of making bad term papers and cartoonish cat pictures.

I don’t see how this is going to be “carbon neutral” going forward.
 

Realityck

macrumors G4
Nov 9, 2015
10,723
16,110
Silicon Valley, CA
Currently, Apple's AI cloud servers are believed to be running on multiple connected M2 Ultra chips, which were originally designed solely for desktop Macs. Whenever the M5 is adopted, its advanced dual-use design is believed to be a sign of Apple future-proofing its plan to vertically integrate its supply chain for AI functionality across computers, cloud servers, and software.
It will be interesting to compare how Apple is doing with its ARM processing in cloud servers versus Google, which seems to be losing its green advantage due to excessive AI usage. This advanced SoIC packaging technology for its M5 chips in the next year will likely further illustrate which company has better technology for tackling AI's energy requirements in cloud servers. :cool:

As Google has rushed to incorporate artificial intelligence into its core products — with sometimes less-than-stellar results — a problem has been brewing behind the scenes: the systems needed to power its AI tools have vastly increased the company’s greenhouse gas emissions.
AI systems need lots of computers to make them work. The data centers needed to run them, essentially warehouses full of powerful computing equipment, suck up tons of energy to process data and manage the heat all of those computers produce.
The end result has been that Google’s greenhouse gas emissions have soared 48% since 2019, according to the tech giant’s annual environment report. The tech giant blamed that growth mainly on “increased data center energy consumption and supply chain emissions.”
Now, Google is calling its goal to reach net-zero emissions by 2030 “extremely ambitious,” and said the pledge is likely to be affected by “the uncertainty around the future environmental impact of AI, which is complex and difficult to predict.” In other words: a sustainability push by the company — which once included the slogan “don’t be evil” in its code of conduct — has gotten more complicated thanks to AI.
 

JordanCautious

macrumors regular
Sep 26, 2023
214
534
Looking forward to buying a M5 Max MacBook Pro whenever it comes out. Will definitely keep my eyes on this
 

Churchman

macrumors member
Aug 31, 2022
56
64
The Max chips (M1 Max, M2 Max, and the 12 P-core M3 Max) have 400 GB/s. The 10 P-core M3 Max is reduced to 300 GB/s. The M2 Pro was 200, the M3 Pro has 150. For the base chips, the M1 had 66.67, the M2 and M3 100, and the M4 has 120 GB/s.

If you look at these numbers, the M4's 120 GB/s is probably what Apple considers the necessary minimum for Apple Intelligence. The M3 Pro barely has more.

What you need depends on what you do. I dug into this because I run astrophysical simulations over very long timescales: I repeat the same simple calculations on a large set of numbers many times, which means most of the time the CPU is just waiting for RAM. For me, RAM speed is king. For you, it could be completely irrelevant. For most people it's not really relevant, which is why they get away with pushing the people who do need it toward more expensive products (or a PC).
Thanks so much for this!
 

Allen_Wentz

macrumors 68030
Dec 3, 2016
2,899
3,162
USA
I think Apple are at a crossroads here. On the one hand, they love to appeal to ‘creators’ but on the other AI does the creativity for you. Be interesting to see how they pitch that in the future.
Anyone who thinks "AI does the creativity for you" is part of a serious problem.
 

Allen_Wentz

macrumors 68030
Dec 3, 2016
2,899
3,162
USA
I'm still just blown away they are charging one THOUSAND dollars just for your RAM & SSD upgrades there

(At retail prices, that spec goes from $1699 to $2699, just for the RAM/SSD upgrades)
And I am just blown away at the RAM & SSD upgrades we can get today for only one THOUSAND dollars. In the past we routinely paid more than that for RAM & SSD upgrades measured in MB.
 

AppleLeaker

macrumors newbie
Feb 2, 2023
2
1
It's a bit more sophisticated than that. Right now a SoC has all of its components on a monolithic die - the CPU and GPU cores, the NPU, memory controllers, built in USB/Thunderbolt controllers, Secure Enclave, etc. What SoIC allows for is all of those SoC components to be split out and fabricated separately but then integrated back with each other with resulting performance that's actually better than a monolithic SoC die.

So instead of an Ultra being two Max chips being stitched together it could be a large CPU tile joined to a large GPU tile with smaller controller component tiles added on. All of the M series chips would be built this way, not just the Ultra. This would have the side effect of increased yields by making smaller and less complex chiplet dies, and allow Apple to focus limited bleeding edge process node capacity on important sections of the tiles like the CPU and GPU and have controller tiles manufactured on older but still efficient process nodes.
Isn’t this what Intel is doing with their “tile” design? It also lets them fabricate the less important portions of the chip on cheaper nodes like 6nm.
 

BuffyzDead

macrumors regular
Dec 30, 2008
230
327
The saying “Apple is playing 3-dimensional chess while everyone else is still playing 2-dimensional chess” comes to mind.
 

ChrisA

macrumors G5
Jan 5, 2006
12,747
1,912
Redondo Beach, California
Okay. so does this challenge Nvidia's H100 and Blackwell chips? Asking for my AAPL shares.
I think so. Not a threat in overall performance, but in performance per dollar.

Apple is in a kind of unique position because they can buy Apple Silicon chips at much lower than market price. If you or I were to try to build out a server farm using Apple Silicon, we'd have to pay close to retail prices for thousands of Mac Studios. But Apple can charge themselves a lower price.

But even at full retail price, Apple Silicon is competitive. My M2 Pro is maybe half as fast as a PC with an RTX 3070 GPU, but the PC would cost a bit more. As you move up, the Mac Studio would be about like an Nvidia A6000, but the A6000-based server might cost double the price of a fully loaded Mac Studio.

Then if you need even more, you have to start networking the Apple computers and putting them in racks.
 

rp2011

macrumors 68020
Oct 12, 2010
2,445
2,817
Is Apple getting back into the server market?
Microsoft, Amazon, Google, Tesla: everyone has been developing their own server chips for a while now. Apple has been the biggest holdout of the big guns so far, especially for AI.

Cloud services and cloud compute are more important than ever before and no one wants to be beholden to Nvidia, pay their heavy markups, or be on their waitlist for availability.
 
Last edited:

Axemantitan

macrumors 6502a
Mar 16, 2008
538
96
Apple is coming out with new CPU generations at a blazing fast rate. I have trouble keeping up. It's a far cry from the PowerPC days. It just proves that if you want something done right, you have to do it yourself.
 

HDFan

Contributor
Jun 30, 2007
6,957
3,078
What's interesting is that teardowns of the M4 iPad Pros show they actually have 12GB of RAM, not the 8GB advertised;

where the RAM size isn't advertised prominently

It's in their specifications.

the 1TB+ iPads offer 16GB of RAM, not 12GB

the 1TB+ iPads offer 16GB of RAM

Yes.

The Max chips (M1 Max, M2 Max, and the 12 P-core M3 Max) have 400 GB/s. The 10 P-core M3 Max is reduced to 300 GB/s. The M2 Pro was 200, the M3 Pro has 150. For the base chips, the M1 had 66.67, the M2 and M3 100, and the M4 has 120 GB/s.

And the M2 Ultra has 800 GB/s.
 

macduke

macrumors G5
Jun 27, 2007
13,300
20,080
Also hoping by this time 16GB integrated RAM will be the standard RAM size for the Mac.
You know it won’t. At this point we’ll be lucky to start at 12GB. But I hope it does start at 16GB because my intention is to upgrade to the M5 iPad Pro 11” and I don’t want to have to buy a lot of storage to get a decent memory spec. Not that I do a lot on my iPad because iPadOS, but on my M1 which has 16GB, it’s nice to have because apps aren’t always closing out in the background. I would be fine with getting 16GB again, and I’m tired of the large 12.9” size, so whatever is cheap would be great. I’m tired of trying to use my iPad as a computer and just want a great display and sufficient memory.
 

thebart

macrumors 6502
Feb 19, 2023
405
360
Perhaps the M5 will reveal Apple’s plans for the foreseeable future. Data centers and consumer products that do all your work for you. I’d prefer to do my own work but I’m not in charge.
I saw some CEO guy say in an interview that in 5 years AI will do 90% of your job so you can chill. Dude, when an AI can do 51% of my job they've already fired my ass. I will be chilling alright.
 