
schneeland

macrumors regular
May 22, 2017
234
764
Darmstadt, Germany
Apple is a direct development partner to TSMC. They literally are the only reason N3 (by way of N3B) was feasible to even produce after everyone else pulled out. Apple's capital investment in TSMC's lines makes it almost a certainty that they'll be first in line for each new node, because the nodes themselves aren't economically viable without massive upfront capital investment.
Yes, I'm not doubting that Apple would be first in line for N2. I'm just not sure if a) N2 will really be ready for mass production next year and b) if it is, if Apple will use it for the M chips or only for the iPhone (with M chips following 2026).
 

ric22

Suspended
Mar 8, 2022
2,296
2,179
It's going to pain Apple to spend the extra ~$20 required per Mac to double the base RAM and storage on every device... the fact they will no longer be able to get away with pathetic amounts of both is perhaps the greatest thing about AI on Macs 😅
 

Asbow

macrumors regular
Aug 17, 2020
199
361
In 2007, Apple Computer, Inc. became Apple Inc.

Yet today, Apple is more of a “computer” company than they ever were before the name change. Today they design their own CPUs/GPUs, OS, programming languages, developer tools, and now cloud AI servers.

Two big things they do not do themselves are R&D for new chip fabrication technology and manufacturing/assembly. They seem pretty steadfast in their avoidance of manufacturing, but I wonder if they might get into fab process R&D. It might be appealing to them to own some special fab sauce that nobody else has.
I’m surprised they don’t.
 
  • Like
Reactions: xmach

turbineseaplane

macrumors P6
Mar 19, 2008
15,967
35,259
It's going to pain Apple to spend the extra ~$20 required per Mac to double the base RAM and storage on every device... the fact they will no longer be able to get away with pathetic amounts of both is perhaps the greatest thing about AI on Macs 😅

Sleeping in the bed they made

It's delicious
 
  • Like
Reactions: rezwits and ric22

firewood

macrumors G3
Jul 29, 2003
8,116
1,358
Silicon Valley
Okay. so does this challenge Nvidia's H100 and Blackwell chips?

Likely a good challenge in TOPS per Watt. And the TOPS Per Watt numbers are very important when the largest data centers are constrained by power and cooling, not by chip count. And the ML TOPS per dollar might also beat Blackwells for Apple, as Apple doesn't have to pay any Nvidia markup when using wafers of their own design built on TSMC fab lines that they partially financed.
 
  • Like
Reactions: KENESS and xmach

theluggage

macrumors 604
Jul 29, 2011
7,692
7,894
Supposedly 6GB modules are currently cheaper than 4GB modules so it could be cheaper for Apple to put 12 in the iPad and call it 8 (giving them the option to drop it back down to 8 if the pricing changes, hence why they're not advertising it as 12).
Has anybody confirmed whether these iPads actually show 12GB, or if they've been knobbled to 8GB?

Apple can probably get away with that on the iPad - where the RAM size isn't advertised prominently and they don't offer BTO RAM upgrades separate from storage bumps. I think it would go down like a lead balloon if they tried it on a Mac and then offered a $200-for-8GB upgrade.

Or it could be that Apple is building M4 chips with 12 by default and that 12GB will be the new starting tier for base Macs. While not the jump to 16 most users are hoping for it at least would get us off of 8.
If I recall correctly, the 1TB+ iPads offer 16GB of RAM, not 12GB. If they're going to offer 12GB and 24GB Macs once the 8GB tier is dropped, I don't really see a niche for a 16GB SoC. So the "temporary freebie thanks to price fluctuations" explanation sounds more plausible.

Still, it's high time that the Macs went to 16GB minimum (except maybe for the cheapest MBA, which - when M4 comes out - will probably be the current 8GB M3 model anyway) especially since the new Copilot+ PCs (with ARM-based SoC processors) start at 16GB.
 

HVDynamo

macrumors 6502a
Feb 21, 2011
727
1,103
Minnesota
Has anybody confirmed whether these iPads actually show 12GB, or if they've been knobbled to 8GB?

Apple can probably get away with that on the iPad - where the RAM size isn't advertised prominently and they don't offer BTO RAM upgrades separate from storage bumps. I think it would go down like a lead balloon if they tried it on a Mac and then offered a $200-for-8GB upgrade.


If I recall correctly, the 1TB+ iPads offer 16GB of RAM, not 12GB. If they're going to offer 12GB and 24GB Macs once the 8GB tier is dropped, I don't really see a niche for a 16GB SoC. So the "temporary freebie thanks to price fluctuations" explanation sounds more plausible.

Still, it's high time that the Macs went to 16GB minimum (except maybe for the cheapest MBA, which - when M4 comes out - will probably be the current 8GB M3 model anyway) especially since the new Copilot+ PCs (with ARM-based SoC processors) start at 16GB.
Yeah, I think even the cheapest of the cheap machines should be shipping with 16GB minimum today, and the Pros should start at either the odd 24GB or, preferably, 32GB. Even 12GB on the base models is just OK, not really enough of an increase: if the AI stuff eats 4GB, we're still effectively left with the 8GB we have now, which isn't enough. I've been on the "starting at 8GB is dumb" train for the last three years or so. I didn't even like it on the M1 for longevity's sake, and now it's just absurd.
 

txscott

macrumors regular
Oct 17, 2012
201
472
In 2007, Apple Computer, Inc. became Apple Inc.

Yet today, Apple is more of a “computer” company than they ever were before the name change. Today they design their own CPUs/GPUs, OS, programming languages, developer tools, and now cloud AI servers.

Two big things they do not do themselves are R&D for new chip fabrication technology and manufacturing/assembly. They seem pretty steadfast in their avoidance of manufacturing, but I wonder if they might get into fab process R&D. It might be appealing to them to own some special fab sauce that nobody else has.
It appears to me that AAPL's strategy is to avoid owning capital assets and hiring people in its supply chain operations where it can. This gives it flexibility and mitigates risk: hundreds of thousands of employees in jurisdictions across the globe, and hundreds of billions of dollars of assets that can become obsolete, all stay off the books.

AAPL is able to do this at a cost and quality level most other manufacturers cannot achieve with their own facilities because of the scale at which they operate and the level of effort they invest in managing their partners and processes. This "operational art" is the value that Cook brings, and a significant reason why AAPL has a $3 trillion market cap.
 

The_Martini_Cat

macrumors 6502
Aug 4, 2015
296
331
So, because the chips are stacked on top of each other rather than connected by a long roadway to the edge of the die, they can communicate quicker? I know when I was working on the drum machine there was a clock signal line (wire) that went all the way around the edge of the board, a huge, long distance, and it actually did cause a problem: the board's performance was flaky. It seems silly to even think that, because electrical signals travel at the speed of light, right? But distance is distance. Trust me on this one.🍸😹🙀😹
 
  • Like
Reactions: Allen_Wentz

ChrisA

macrumors G5
Jan 5, 2006
12,749
1,912
Redondo Beach, California
More like: just as the video streaming revolution of the 2000s showed up how incapable desktop processors were, the AI revolution is doing the same, and the whole industry has been caught with its pants down, Apple included.
Not really. My M2 Pro with 16GB RAM is running a local copy of Meta's open-source Llama 3 8B at a very decent 28 tokens per second, and that is while also running the usual Mac desktop, Chrome browser, and some other stuff.

I'd say my Mac is keeping up with an Nvidia 2070 GPU. The 2070 might be a little faster for running AI models, but it is close enough. The better-specced Macs can be up to 4x faster than the 2070 for AI. (I have no interest in video and games; I'm only comparing AI tasks.)

So I have to disagree and say Apple did quite well with cost vs performance on AI tasks. The M4 is likely much better than my M2 for this kind of work. I'd upgrade except that it is my limited brainpower holding back the development of a robotics project, it is not yet the M2 hardware that is the bottleneck.

I might be at the point where the M2 is "slow" about the time the M5 is released.

One more thing: read the comments and docs that were released by Apple with their open-source OpenELM. They say Apple used Linux to train the model (we don't know if it was Linux on Intel or Linux on Apple Silicon) because "Slurm is not yet running on MacOS." So apparently, as of a few months ago, Apple had not yet built out a macOS-based AI training farm; I'd guess it was Linux on Intel with Nvidia GPUs. But Apple's OpenELM does run on the M2 and even on an iPhone. This is Apple's "edge AI" that runs locally on customer devices, and it has been out for a while now. Google "OpenELM" to find it. You will see that it runs well; Apple is not so "behind the curve".
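For anyone who wants to sanity-check that 28 tokens/s figure: decode speed on these machines is mostly memory-bandwidth-bound, since every generated token has to stream the full set of weights from RAM. A rough Python sketch, where 200 GB/s is Apple's published M2 Pro memory bandwidth and the ~5 GB model size is an assumed llama.cpp-style quantisation of an 8B model:

```python
def decode_tps_ceiling(bandwidth_gb_s: float, model_gb: float) -> float:
    """Upper bound on tokens/sec for bandwidth-bound LLM decoding:
    each generated token reads all model weights from memory once."""
    return bandwidth_gb_s / model_gb

# M2 Pro: 200 GB/s unified memory; ~5 GB for a quantised 8B model
# (the quantisation level is an assumption, not from the post).
ceiling = decode_tps_ceiling(200, 5.0)
print(f"~{ceiling:.0f} tokens/s ceiling")  # ~40 tokens/s
```

An observed 28 tokens/s sits comfortably under that ~40 tokens/s ceiling, so the claim is plausible for a quantised 8B model.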
 

dannys1

macrumors 68040
Sep 19, 2007
3,702
6,869
UK
I think Apple are at a crossroads here. On the one hand, they love to appeal to ‘creators’ but on the other AI does the creativity for you. Be interesting to see how they pitch that in the future.

If they use gen AI the way they're doing so far, I think it's fine; it's mainly for fun things. Creating emoji I thought was a genius idea, to be fair.

I'm not sure they'd ever have the very best image generating AI models either.
 
  • Like
Reactions: sleeptodream

ChrisA

macrumors G5
Jan 5, 2006
12,749
1,912
Redondo Beach, California
So, because the chips are stacked on top of each other rather than connected by a long roadway to the edge of the die, they can communicate quicker? I know when I was working on the drum machine there was a clock signal line (wire) that went all the way around the edge of the board, a huge, long distance, and it actually did cause a problem: the board's performance was flaky. It seems silly to even think that, because electrical signals travel at the speed of light, right? But distance is distance. Trust me on this one.🍸😹🙀😹
Light moves about one foot per nanosecond, give or take a little, and electronic pulses in wire move a bit slower than light in a vacuum. The problem is not the delay itself but getting the same pulse to every place it is needed at the same time, so the parts stay in sync. It is this "clock skew" that they are trying to fight.
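A quick back-of-envelope calculation shows why that long board-edge clock trace hurt. This assumes a velocity factor of roughly 0.5 for a trace on FR-4 (the exact figure depends on the board material and geometry):

```python
C = 0.2998             # speed of light in vacuum, metres per nanosecond
VELOCITY_FACTOR = 0.5  # rough figure for a PCB trace in FR-4 (assumption)

def trace_delay_ns(length_m: float) -> float:
    """Propagation delay of a signal along a board trace."""
    return length_m / (C * VELOCITY_FACTOR)

# A clock line routed 0.3 m around a board edge:
delay = trace_delay_ns(0.3)  # ~2 ns
# At a 500 MHz clock (2 ns period), parts at the far end of that
# trace see each clock edge a full cycle later than parts near
# the source, which is exactly the skew that makes boards flaky.
```

Stacking dies shrinks those route lengths from tens of centimetres to millimetres, which is a big part of why skew gets easier to manage.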
 

sleeptodream

macrumors 6502
Aug 29, 2022
286
708
I think Apple are at a crossroads here. On the one hand, they love to appeal to ‘creators’ but on the other AI does the creativity for you. Be interesting to see how they pitch that in the future.
I could be wrong but I don’t think we’ll see Apple release any kind of AI that would steal work from legitimate creatives.

Everything they’ve announced so far are either for editing your writing or photos, or generating purposefully gimmicky things like emoji and cartoonish/sketch-like pictures, nothing photorealistic.

It’s not writing an essay for you, it’s taking one you wrote & rewording it. The pictures it generates are not ones anyone would have paid to have made, and it has set styles so people will know it was AI generated. I think this is all very intentional
 

Realityck

macrumors G4
Nov 9, 2015
10,723
16,121
Silicon Valley, CA
Currently, Apple's AI cloud servers are believed to be running on multiple connected M2 Ultra chips, which were originally designed solely for desktop Macs. Whenever the M5 is adopted, its advanced dual-use design is believed to be a sign of Apple future-proofing its plan to vertically integrate its supply chain for AI functionality across computers, cloud servers, and software.
I like to see more articles discussing what Apple uses in their server farms. Something that isn't discussed enough.
 
  • Like
Reactions: xmach

ChrisA

macrumors G5
Jan 5, 2006
12,749
1,912
Redondo Beach, California
If they use gen AI the way they're doing so far, I think it's fine; it's mainly for fun things. Creating emoji I thought was a genius idea, to be fair.

I'm not sure they'd ever have the very best image generating AI models either.
Right now the AI is used for stuff that is mostly only entertaining. But the intent is to use it for all the normal stuff you do with a computer, like: "Siri, I want to reply to Susan's email about the 34th Street job. Tell her, 'let's go with her option #2'."

The AI then figures out which email this is, reads it to find option #2, and then composes a reply and puts it on screen with both "OK" and "edit" buttons. Not entertaining, just basic business. This is what Apple is working on now.

If AI is done right, it will not seem like AI; it will work under the hood and do what you want. One of my goals for a robot I'm working on would be to ask it, "Is my UPS package here yet? Bring it inside when you can." The robot's AI would look for delivery emails or simply watch the outdoor security camera (Ring?), and when the package is there, it opens the door, brings it inside, and shuts the door. That's not 100% brilliant, but still way more advanced than anything we have now. The ability to do very simple-minded tasks like this would be revolutionary: folding laundry, placing dishes in the dishwasher, taking out the garbage, taking groceries out of the bag and putting them away, cleaning countertops, and so on. No great feat of intellectualism, but still very useful. I figure people would pay the price of a car ($35K?) for this. IMO, the "AI" should disappear into the product.
 

subjonas

macrumors 603
Feb 10, 2014
5,856
6,261
For that price one might as well buy the studio. Which, of course, is the whole point of apple’s memory and ssd upgrading price scheme.
But why would Apple want you to buy a Studio if they’re making so much profit off selling you memory and ssd? Are they making even more profit with Studios? I’d be interested in seeing what the numbers are.
 

Cervisia

macrumors newbie
Jun 22, 2024
28
82
So I have to disagree and say Apple did quite well with cost vs performance on AI tasks. The M4 is likely much better than my M2 for this kind of work. I'd upgrade except that it is my limited brainpower holding back the development of a robotics project, it is not yet the M2 hardware that is the bottleneck.
Your M2 Pro is better equipped to run AI tasks than an M4, or even the M3 Pro, because it has higher memory bandwidth. Remember when people made excuses for the spec cuts, saying they wouldn't matter in anything average users do? Well, not true anymore. Reducing memory bandwidth (on the 10-core M3 Max too) was probably a deliberate step to make the cheaper chips worse at AI tasks. Even the M1 Pro beats the M3 Pro because of the bandwidth, despite the M3 Pro having much better GPU cores. Don't upgrade.
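The bandwidth figures behind this claim are Apple's published unified-memory specs, and the effect on LLM generation is easy to sketch, since decode speed is roughly bandwidth-bound. The ~5 GB model size below is an assumed quantised 8B model, purely for illustration:

```python
# Apple's published unified-memory bandwidths, in GB/s:
BANDWIDTH_GB_S = {
    "M1 Pro": 200,
    "M2 Pro": 200,
    "M3 Pro": 150,   # cut relative to the M1/M2 Pro
    "M4 (base)": 120,
}

def decode_ceiling_tps(chip: str, model_gb: float = 5.0) -> float:
    """Bandwidth-bound upper limit on decode tokens/sec: each token
    streams the full weight set from memory once."""
    return BANDWIDTH_GB_S[chip] / model_gb

for chip in BANDWIDTH_GB_S:
    print(f"{chip}: ~{decode_ceiling_tps(chip):.0f} tok/s ceiling")
```

By this rough model, the M1 Pro's ceiling is a third higher than the M3 Pro's for the same model, which is how an older chip can out-generate a newer one despite weaker GPU cores.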
 
  • Disagree
  • Wow
Reactions: eas and Chuckeee

ric22

Suspended
Mar 8, 2022
2,296
2,179
Your M2 Pro is better equipped to run AI tasks than an M4, or even the M3 Pro, because it has higher memory bandwidth. Remember when people made excuses for the spec cuts, saying they wouldn't matter in anything average users do? Well, not true anymore. Reducing memory bandwidth (on the 10-core M3 Max too) was probably a deliberate step to make the cheaper chips worse at AI tasks. Even the M1 Pro beats the M3 Pro because of the bandwidth, despite the M3 Pro having much better GPU cores. Don't upgrade.
In what benchmark or scenario does an M1 Pro beat an M3 Pro? And I'm on your side regarding my dislike of how they handicapped the M3 Pro in hopes of selling more Max chips!
 

Churchman

macrumors member
Aug 31, 2022
56
64
Your M2 Pro is better equipped to run AI tasks than an M4, or even the M3 Pro, because it has higher memory bandwidth. Remember when people made excuses for the spec cuts, saying they wouldn't matter in anything average users do? Well, not true anymore. Reducing memory bandwidth (on the 10-core M3 Max too) was probably a deliberate step to make the cheaper chips worse at AI tasks. Even the M1 Pro beats the M3 Pro because of the bandwidth, despite the M3 Pro having much better GPU cores. Don't upgrade.

Hi Cervisia, are there stats on this? I would like to understand the memory bandwidth issue better because I need to upgrade my MBP from its current Intel chip to Apple Silicon. I don't want an AI chip. But I also don't want a chip that has been intentionally degraded in terms of memory bandwidth just so that Apple can reduce production costs. (I don't love Apple so much these days. Feels like they are cheap, penny-pinching Ebenezer Scrooges. I miss the Apple that made it their business to constantly surprise and delight me.) I want to know what inflection point I should look for to help me know when to upgrade. If the M2 Pro had the best memory bandwidth, what was that throughput? I may wait for an M4 Pro / M4 Max / M5 / M5 Pro / MX to match or exceed that M2 Pro data point.
 

tim_apple

macrumors member
Mar 15, 2019
37
41
I honestly don’t believe in the on-device AI, at least not in the next 5-6 years. Sure, there will be some simple tasks that can be processed locally, and the hybrid model will eventually slowly be moved locally, but a hybrid model where most of the AI processing is done on the server side is probably here for the foreseeable future.

Hardware advancements have been slow over the last 5-8 years, but we might be, out of necessity, on the verge of seeing some really big advancements for consumers in the next 2-4 years.

It’s great to see Apple is working on custom server chips too.
Have you tried on-device AI (LLMs)? It’s actually quite powerful.
 

dannys1

macrumors 68040
Sep 19, 2007
3,702
6,869
UK
Right now the AI is used for stuff that is mostly only entertaining. But the intent is to use it for all the normal stuff you do with a computer, like: "Siri, I want to reply to Susan's email about the 34th Street job. Tell her, 'let's go with her option #2'."

The AI then figures out which email this is, reads it to find option #2, and then composes a reply and puts it on screen with both "OK" and "edit" buttons. Not entertaining, just basic business. This is what Apple is working on now.

If AI is done right, it will not seem like AI; it will work under the hood and do what you want. One of my goals for a robot I'm working on would be to ask it, "Is my UPS package here yet? Bring it inside when you can." The robot's AI would look for delivery emails or simply watch the outdoor security camera (Ring?), and when the package is there, it opens the door, brings it inside, and shuts the door. That's not 100% brilliant, but still way more advanced than anything we have now. The ability to do very simple-minded tasks like this would be revolutionary: folding laundry, placing dishes in the dishwasher, taking out the garbage, taking groceries out of the bag and putting them away, cleaning countertops, and so on. No great feat of intellectualism, but still very useful. I figure people would pay the price of a car ($35K?) for this. IMO, the "AI" should disappear into the product.

I agree. I mean, original Siri is AI, but people don't think of it like that anymore. New Siri is basically what they made original Siri out to be: a way for you to say what you want in so many different ways, and with so many different words, that you can just speak naturally and have it action your command.
 

name99

macrumors 68020
Jun 21, 2004
2,333
2,219
Is Apple getting back into the server market?
Define "market"...

They will certainly make large machines for themselves.
Will they sell those machines to others? Maybe, but perhaps via a special program with no real public sale, more like buying an Nvidia DGX than buying a Mac.
Or maybe they won't sell them, all you can do is rent small or large amounts of time, like AWS.
 