
MacRumors

macrumors bot
Original poster
Apr 12, 2001
64,281
32,080


Apple will reportedly use a more advanced SoIC packaging technology for its M5 chips, as part of a two-pronged strategy to meet its growing need for silicon that can power consumer Macs and enhance the performance of its data centers and future AI tools that rely on the cloud.


Developed by TSMC and unveiled in 2018, SoIC (System on Integrated Chip) technology allows for the stacking of chips in a three-dimensional structure, providing better electrical performance and thermal management compared to traditional two-dimensional chip designs.

According to the Economic Daily, Apple has expanded its cooperation with TSMC on a next-generation hybrid SoIC package that additionally combines thermoplastic carbon fiber composite molding technology. The package is said to be in a small trial production phase, with the intention of mass producing the chips in 2025 and 2026 for new Macs and AI cloud servers.

References to what is believed to be Apple's M5 chip have already been discovered in official Apple code. Apple has been working on processors for its own AI servers, made with TSMC's 3nm process and targeting mass production by the second half of 2025. However, according to Haitong analyst Jeff Pu, Apple plans to assemble AI servers powered by its M4 chip in late 2025.

Currently, Apple's AI cloud servers are believed to run on multiple connected M2 Ultra chips, which were originally designed solely for desktop Macs. Whenever the M5 is adopted, its dual-use design is seen as a sign that Apple is future-proofing its plan to vertically integrate its supply chain for AI functionality across computers, cloud servers, and software.

(Via DigiTimes.com.)

Article Link: Apple M5 Chip's Dual-Use Design Will Power Future Macs and AI Servers
 

krspkbl

macrumors 68020
Jul 20, 2012
2,213
5,310
More like: just as the video streaming revolution of the 2000s showed how incapable desktop processors were, the AI revolution is doing the same, and the whole industry has been caught with its pants down, Apple included.
True. AI needs powerful hardware. I have a 4080 and all 16GB of VRAM gets used up easily. I wish I'd gotten a 4090 just for the extra memory, but I think even then I'd wish I had a 48GB GPU lol. I obviously don't need that much, or even a 4090, but my point is that we're seeing required specs go up significantly.

We have brand-new AI-marketed CPUs that can't support basic things like that creepy spyware "feature" Microsoft wants to put into Windows, and all that really does is screenshot your PC and analyze everything on it.

If this AI craze sticks around, and I think it will, then we're going to see specs increase significantly. For starters, Apple won't be able to get away with 8GB base Macs anymore lol. Poor Apple is going to spend some money upgrading hardware.

Storage is where we'll see changes too. Future OS installs might get much bigger if on-device processing is done.
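
To put rough numbers on the VRAM point above, here's a back-of-the-envelope sketch in Python. The parameter counts, bytes per parameter, and overhead factor are illustrative assumptions, not measured figures:

```python
# Rough VRAM estimate for running an LLM locally at fp16.
# All numbers here are ballpark assumptions for illustration.

def inference_vram_gb(params_billions: float,
                      bytes_per_param: float = 2.0,  # fp16 weights
                      overhead: float = 1.2) -> float:
    """Memory for the weights alone, times a fudge factor
    for activations and the KV cache."""
    return params_billions * bytes_per_param * overhead

for b in (7, 13, 70):
    print(f"{b:>2}B params @ fp16: ~{inference_vram_gb(b):.0f} GB")
# Output: 7B -> ~17 GB, 13B -> ~31 GB, 70B -> ~168 GB.
# Even a modest 7B model barely fits in a 16GB card, which is
# why base RAM and VRAM specs are under pressure.
```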
 

Ramchi

macrumors 65816
Dec 13, 2007
1,101
572
India
Eventually, Apple will use all its tech only for the consumer market rather than the enterprise one. Microsoft has been successful in keeping its enterprise market. Apple typically supports developers and small businesses rather than the huge data centres run by big corporations. Apple is trying hard to keep its hardware relevant in the AI space, but it is probably behind in the actual technology, since its expertise in data engineering is not in the same league as that of front-runners like Google, FB, Amazon, Microsoft, etc.
 

DelayedGratificationGene

macrumors 6502a
Jan 11, 2020
839
2,878
Ramchi said:
Eventually, Apple will use all its tech only for the consumer market rather than the enterprise one. Microsoft has been successful in keeping its enterprise market. Apple typically supports developers and small businesses rather than the huge data centres run by big corporations. Apple is trying hard to keep its hardware relevant in the AI space, but it is probably behind in the actual technology, since its expertise in data engineering is not in the same league as that of front-runners like Google, FB, Amazon, Microsoft, etc.
Um, maybe. I don't work for Apple, so I have literally no idea what they're definitely going to do, or, even better, what is actually going to happen.
 

bradman83

macrumors 65816
Oct 29, 2020
1,074
2,678
Buffalo, NY
This sounds like the theorized "Double Ultra" chips that would have been two M1 Ultra chipsets stacked on top of each other, connected at the beltline interface between two Mx Maxes.

It's a bit more sophisticated than that. Right now an SoC has all of its components on a monolithic die - the CPU and GPU cores, the NPU, memory controllers, built-in USB/Thunderbolt controllers, Secure Enclave, etc. What SoIC allows is for all of those SoC components to be split out and fabricated separately, then integrated back together, with resulting performance that's actually better than a monolithic SoC die.

So instead of an Ultra being two Max chips stitched together, it could be a large CPU tile joined to a large GPU tile, with smaller controller tiles added on. All of the M-series chips would be built this way, not just the Ultra. This would have the side effect of increasing yields by making smaller, less complex chiplet dies, and it would allow Apple to focus limited bleeding-edge process node capacity on important sections like the CPU and GPU tiles while having controller tiles manufactured on older but still efficient process nodes.
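
The yield point can be made concrete with the simple Poisson model often used for first-order die-yield estimates. The defect density and die areas below are made-up illustrative numbers, not TSMC figures:

```python
import math

def poisson_yield(area_mm2: float, defects_per_cm2: float) -> float:
    """Fraction of good dies under the first-order Poisson yield
    model Y = exp(-D * A); area converted from mm^2 to cm^2."""
    return math.exp(-defects_per_cm2 * area_mm2 / 100.0)

D = 0.2  # hypothetical defect density (defects per cm^2)
print(f"800 mm^2 monolithic die: {poisson_yield(800, D):.0%} yield")  # ~20%
print(f"200 mm^2 chiplet tile:   {poisson_yield(200, D):.0%} yield")  # ~67%
# Bad tiles are screened out before packaging ("known good die"),
# so four small tiles waste far less wafer area than one big die.
```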
 

gusping

macrumors 68000
Mar 12, 2012
1,938
2,134
krspkbl said:
True. AI needs powerful hardware. I have a 4080 and all 16GB of VRAM gets used up easily. I wish I'd gotten a 4090 just for the extra memory, but I think even then I'd wish I had a 48GB GPU lol. I obviously don't need that much, or even a 4090, but my point is that we're seeing required specs go up significantly.

We have brand-new AI-marketed CPUs that can't support basic things like that creepy spyware "feature" Microsoft wants to put into Windows, and all that really does is screenshot your PC and analyze everything on it.

If this AI craze sticks around, and I think it will, then we're going to see specs increase significantly. For starters, Apple won't be able to get away with 8GB base Macs anymore lol. Poor Apple is going to spend some money upgrading hardware.

Storage is where we'll see changes too. Future OS installs might get much bigger if on-device processing is done.
Don't worry. Nvidia will soon charge $2500 for a 5090 (severely cut down to a 448-bit bus and 28GB of VRAM). You're welcome.
 

UltimaKilo

macrumors 6502a
Nov 14, 2007
936
840
FL
krspkbl said:
True. AI needs powerful hardware. I have a 4080 and all 16GB of VRAM gets used up easily. I wish I'd gotten a 4090 just for the extra memory, but I think even then I'd wish I had a 48GB GPU lol. I obviously don't need that much, or even a 4090, but my point is that we're seeing required specs go up significantly.

We have brand-new AI-marketed CPUs that can't support basic things like that creepy spyware "feature" Microsoft wants to put into Windows, and all that really does is screenshot your PC and analyze everything on it.

If this AI craze sticks around, and I think it will, then we're going to see specs increase significantly. For starters, Apple won't be able to get away with 8GB base Macs anymore lol. Poor Apple is going to spend some money upgrading hardware.

Storage is where we'll see changes too. Future OS installs might get much bigger if on-device processing is done.
I honestly don't believe in on-device AI, at least not in the next 5-6 years. Sure, some simple tasks will be processed locally, and more of the workload will slowly move on-device over time, but a hybrid model where most of the AI processing is done on the server side is probably here for the foreseeable future.
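
As a sketch of what that hybrid split could look like in practice, here's some toy routing logic in Python. The threshold, function names, and the idea of routing on model size are all hypothetical illustrations, not Apple's actual architecture:

```python
# Toy hybrid-AI router: small jobs run on-device, big ones go to a server.

ON_DEVICE_PARAM_LIMIT = 3_000_000_000  # assume a ~3B-parameter local model cap

def run_locally(prompt: str) -> str:
    # Placeholder for an on-device model call (low latency, private).
    return f"[local] {prompt}"

def run_on_server(prompt: str) -> str:
    # Placeholder for a cloud inference call (bigger model, needs network).
    return f"[server] {prompt}"

def route(prompt: str, model_params_needed: int) -> str:
    if model_params_needed <= ON_DEVICE_PARAM_LIMIT:
        return run_locally(prompt)
    return run_on_server(prompt)

print(route("summarize this note", 1_000_000_000))        # -> [local] ...
print(route("write a detailed report", 70_000_000_000))   # -> [server] ...
```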

Hardware advancements have been slow over the last 5-8 years, but, out of necessity, we might be on the verge of seeing some really big advancements for consumers in the next 2-4 years.

It’s great to see Apple is working on custom server chips too.
 

theluggage

macrumors 604
Jul 29, 2011
7,667
7,826
Is Apple getting back into the server market?
"Server" covers a lot of different uses. I'm sure the focus will be on systems designed to drive AI and high-performance computing services - a booming requirement - rather than file sharing/email/web-hosting. ...but there is already stiff competition there, including power efficient ARM based systems. The competition will include AWS, who have their own ARM-based processors, and Nvidia's Grace Hopper chip (which is a good illustration of where multi-chip technology might come in).

They never left. The M1 Mac mini for $275 lightly used runs circles around the Xserves.
There's a difference between hardware that can be used as a server - the Mac mini - and a system designed specifically for datacentre applications - like the Xserve. Datacentre systems have features that have nothing to do with the CPU, like:
  • being designed from the ground up for high density rack mounting (including layout and cooling)
  • dual redundant power supplies (a PSU failure is a once-in-a-blue-moon event on a single personal computer; in a datacentre with hundreds of the things, it must be Thursday!)
  • hot-swap/RAID disc drives (same logic as above)
  • lights-out power management (for those times when you need to push the power button - again, same logic)
When the PPC Xserve was released, it had several unique selling points: PPC still had a claim to be better than Intel; Linux hadn't gained widespread acceptance, whereas Mac OS was certified as Unix, which gave it cred (and let it run a lot of standard software); and the competition was proprietary stuff like Novell NetWare and commercial Unix systems, which (a) often offered shoddy (and expensive) support for file sharing and mail on Mac networks and (b) had expensive per-seat licensing plans.

When the Intel Xserve was discontinued, it had little left to distinguish it from cheap-and-cheerful Intel servers using the same processors. Customers that still used Windows Server or commercial Unix did so because they weren't inclined to change; personal Macs had adapted to work in such environments (e.g. shifting from AppleTalk and the Apple Filing Protocol to TCP/IP and SMB); and the industry was moving towards Linux, open web/internet protocols, and open-source server software anyway. An Xserve could do the job perfectly well - but so could a generic Intel or AMD server running Linux, for less money. The Mac's primary advantage was its GUI and cool industrial design, which doesn't count for much in a server room running software that techies configure using text files. Take the GUI and user-centric apps away and it was an x86 system running BSD - Open Group certification still counts for something, but Linux is now the de facto standard-setter. Even Apple failed to eat their own dog food and filled their data centres with x86 black boxes.

What's changed now is that Apple are in the CPU and GPU business like never before so they have the opportunity to produce something distinctive for AI and HPC services. The M1/2/3 series don't really cut the mustard, but Apple have the building blocks to make something special - if there is a big enough market to justify the development, which there probably isn't for Mac Pros.

However, the main market for future Apple server kit could be Apple themselves - to run services that not only make money but sell iPhones and Macs - if they don't want to keep contracting their AI out to third parties.
 

bradman83

macrumors 65816
Oct 29, 2020
1,074
2,678
Buffalo, NY
Also hoping that by this time 16GB of integrated RAM will be the standard RAM size for the Mac.
What's interesting is that teardowns of the M4 iPad Pros show they actually have 12GB of RAM, not the 8GB advertised: 2x 6GB RAM modules instead of 2x 4GB (based on the serial numbers on the physical LPDDR5X chips attached to the SoC).

Supposedly 6GB modules are currently cheaper than 4GB modules, so it could be cheaper for Apple to put 12 in the iPad and call it 8 (giving them the option to drop back down to 8 if the pricing changes, which would explain why they're not advertising it as 12). Or it could be that Apple is building M4 chips with 12 by default and 12GB will be the new starting tier for base Macs. While not the jump to 16 that most users are hoping for, it would at least get us off of 8.
 

vkd

macrumors 6502a
Sep 10, 2012
977
372
As soon as the M1 launched, it was obvious that at some point in the near future an M5 would also appear. As will other M's of increasing numerology as well as - get this - capability.
 