Apple Saying Goodbye to Intel Processors

Originally Posted by Carmudgeon
The benchmarks for the DTKs (so much for the NDAs) are promising, and that's running a non-native benchmarking app through the Rosetta translator, on a two-year-old tablet chip design.


Funnily enough, on some of the Mac websites, I've seen people making fun of the benchmarks.

I suspect that Apple has deliberately handicapped the DTKs such that programs like Geekbench can only run in emulation even though GB has an ARM version.
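Incidentally, whether a process is running natively or under Rosetta 2 is something macOS will report: Apple documents a `sysctl.proc_translated` sysctl for exactly this. A small sketch (the helper name is mine; it returns None on systems where the query doesn't exist, i.e., anything that isn't macOS):

```python
import ctypes
import ctypes.util

def rosetta_translated():
    """Query macOS's sysctl.proc_translated: 1 if this process is running
    under Rosetta 2 translation, 0 if native, None if unavailable."""
    try:
        libc = ctypes.CDLL(ctypes.util.find_library("c"), use_errno=True)
        val = ctypes.c_int(0)
        size = ctypes.c_size_t(ctypes.sizeof(val))
        ret = libc.sysctlbyname(b"sysctl.proc_translated",
                                ctypes.byref(val), ctypes.byref(size),
                                None, 0)
        return val.value if ret == 0 else None
    except (OSError, AttributeError, TypeError):
        # No libc, or no sysctlbyname symbol (e.g., Linux/glibc)
        return None

print(rosetta_translated())
```

An app could use this to behave differently when translated, which is roughly the sort of gating speculated about above, just in reverse.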

The About This Mac/System Profiler screenshots I've seen are also missing information that's customarily easily accessible. No one (outside Apple) seems to actually know their true specs. There's no GPU information, for example, while on my MBP Big Sur gives the customary GPU information.

To me, the benchmarks at this point are meaningless. The Intel DTKs were basically nothing like the Intel hardware that eventually shipped. Among other things, the Intel DTKs had P4 CPUs, while the first production Macs had Core Duo CPUs and quickly transitioned to Core 2 Duos.

I suspect the "real" ARM based computers will use a totally new CPU rather than a repurposed iPad CPU.
 
Originally Posted by bunnspecial
I suspect the "real" ARM based computers will use a totally new CPU rather than a repurposed iPad CPU.


Based on what I've read so far, it would be an 8-core ARM chip with a big.LITTLE architecture. It wouldn't be totally new versus the iPad's, but they definitely wouldn't waste money developing a one-off. At the least, the "little" cores on the ARM Mac would be similar to the iPad/iPhone's, for battery-life optimization.
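A toy sketch of the big/little idea (all task names, loads, and the threshold here are invented for illustration): heavy work goes to performance ("big") cores, background work to efficiency ("little") cores.

```python
def assign_cores(tasks, threshold=1.0):
    """Toy big.LITTLE dispatch: map each (name, estimated_load) task to a
    core class. The fixed threshold is arbitrary; real schedulers use
    runtime heuristics (utilization history, QoS hints), not a cutoff."""
    return {name: ("big" if load >= threshold else "little")
            for name, load in tasks}

jobs = [("video_encode", 3.5), ("mail_sync", 0.2), ("ui_render", 1.4)]
print(assign_cores(jobs))
```

The point of the split is exactly the battery-life angle above: the "little" cores handle the steady background load cheaply, and the "big" cores only wake for bursts.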

Also, since Apple already owns Anobit, they would likely integrate the flash controller into the chip, use raw flash chips, and then run their own file system, like the iPad. If they do it right, they can make the storage very fast.
 
Originally Posted by BearZDefect
If only Apple, Google, and Microsoft would cooperate on a new OS so the world of computing could converge.


They "sort of" did. BSD and Linux are close enough that you can build between them, and even where they aren't, most apps and libraries these days have multiple ports, so you can build for one and then build for another. The days when the low level had to be byte-by-byte compatible are gone. We can live with higher-level emulation or ports now; machines are good enough and fast enough for that. These aren't running enterprise legacy apps; they're Chromebook-to-MacBook-Pro types of machines, where user interface and experience are the key.
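As a concrete example of the byte-level compatibility that used to matter: the same 32-bit integer is laid out differently on big- and little-endian machines, and portable code sidesteps that by naming the byte order explicitly (a minimal illustration):

```python
import struct

value = 0x01020304
big = struct.pack(">I", value)     # explicit big-endian layout
little = struct.pack("<I", value)  # explicit little-endian (x86, ARM Macs)

print(big.hex())     # 01020304
print(little.hex())  # 04030201
# Unpacking with the matching format recovers the value on any host:
assert struct.unpack("<I", little)[0] == value
```

Once serialization is explicit like this, the same source builds and behaves identically across platforms, which is the "port instead of bit-compatibility" point.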

The problem is actually more on the "app store" front. Once you pick a platform, you are locked into buying apps from its store, so nobody can make money without letting the app store take a 15-30% cut.
 
I heard that the trial Mac mini running ARM and Windows emulation blew away a native Surface Pro Windows machine. But I assume there's more to it than this?

Nobody is mistaking an iPad, even a Pro, for a legitimate processor.
 
It really depends on how they pick the benchmark.

Windows on a Surface Pro with a crippled spec (e.g., a 14nm Intel Atom) to preserve power would be easy to blow away with an ARM chip running at full power (on TSMC's 7nm). If you compare the trial Mac mini against a Hackintosh with a Threadripper, then it would be almost impossible in theory (assuming both architectures are done well for the same process node, since x86 has a much larger die size and more parallelism).

The only advantage is that their own architecture has no need to be backward compatible with a legacy OS or storage stack (file system -> SSD FTL -> NAND management) and can bypass that overhead entirely; or, if the ARM Mac had just as much die area and ran with the same massive parallelism, the power draw would be **** (unlikely).
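To make the "depends on how they pick the benchmark" point concrete, here's a sketch with invented numbers: the chip that loses on raw throughput can still win on performance per watt, so the chosen metric decides the headline.

```python
# All scores and wattages below are invented for illustration only.
chips = {
    "arm_mini":     {"score": 1100, "watts": 15},
    "threadripper": {"score": 9000, "watts": 280},
}

def winner(metric):
    """Return the chip name that maximizes the given metric."""
    return max(chips, key=metric)

by_raw = winner(lambda c: chips[c]["score"])
by_perf_per_watt = winner(lambda c: chips[c]["score"] / chips[c]["watts"])
print(by_raw, by_perf_per_watt)  # the two metrics crown different chips
```

Same data, two opposite headlines, which is why a vendor-picked benchmark comparison tells you very little on its own.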

This tells you how serious the trouble Intel is in right now. They have to split into a foundry and a fabless company, like AMD did years ago, or they will become Motorola or IBM.
 
Our IT services company only uses Macs, and a portion of our developers run Win10 on them. Likely no longer, unless they compile a version to run on this hardware?
 
How do they run Windows, in Boot Camp or in a VM?
 
I guess we won't have to hear the "Macs are just PCs with the same components" mantra much longer.

You really think Apple likes that kind of talk?

Apple considers itself, and arguably is, revolutionary. It's a lot like how Samsung made many chips and components of Apple's phones; eventually they came to blows, and then they played nice.
 
It would be an interesting turn of events if NVIDIA manages to buy ARM.
 
In what sense?

Apple licenses ARM technology to make its in-house iPhone and iPad chips, and if it plans on making ARM-based desktop CPUs, then it needs to license that as well. However, if NVIDIA gets control of ARM, that will become exceedingly difficult for Apple, unless Apple is willing to use more of NVIDIA's "stuff" in its products (e.g., GPUs). Not that long ago, Apple locked out support for NVIDIA GPUs; NVIDIA can no longer write drivers for macOS because of that. So they're not on the best of terms. We'll see what happens. As it stands, Apple offers extremely poorly performing workstations that run on Intel CPUs and AMD GPUs. Most people have moved on and are running AMD Threadripper-based workstations with powerful NVIDIA GPUs. Anyway, I'm deviating too much. The bottom line is that Apple wants to compete on its own planet again, without any actual competition, as it did in the PowerPC days. I bet NVIDIA wants to piggyback on Apple's decision and make some money off of it.
 
Amusingly, I recently upgraded my 10-year-old Mac Pro with a newer NVIDIA GPU, a GTX 680 2GB. I'm running Mojave as I still use some 32-bit apps, and getting 10.15 on this unit requires a "workaround" since it isn't officially supported.
 
Yes, but you are running that with an Apple driver. That Apple driver supports up to the GTX 780 3GB; beyond that, Apple has locked out support.

Did you flash the EEPROM or are you using the card without a boot screen?

What version of macOS are you on?
 
Per the post, Mojave, so 10.14.6 as of right now. Yes, the card was flashed, so I have the boot screen.

And yes, that's the reason I got the 680, lol
 
NVIDIA drivers made by NVIDIA are available up to macOS High Sierra 10.13.6. So if you're on High Sierra 10.13.6 or lower, you can run any NVIDIA GPU, lol, provided you download and install the NVIDIA driver from here: https://www.nvidia.com/en-us/drivers/results/162473/

The EVGA GTX 680 firmware can be modified to run GTX 770 and 780 GPUs. There's a bit of hacking involved; however, if you ever need to do a mild upgrade, let me know and I'll try to help you. Personally, I gave up on Macs a while back because they are very expensive and just not worth the hassle. Currently, I've got a 16-core Threadripper 1950X with 128GB RAM, a bunch of SSDs, and a RADEON VII 16GB GPU. I'm looking to upgrade soon to a 32-core Threadripper 3970X with an RTX 2080 Ti GPU. I know the 3090 was just launched, but I got the 2080 months ago... it is what it is, too much to do, too little time. The time spent here on the forum is actually R&R for me.
 
Apple holds an ARM architecture license, and has been producing its own designs for years. It doesn't have to rely on ARM except for continued use of the instruction set.

ARM has no value without its IP, the recurring revenues it receives from licensing that IP, and the per-unit royalties. Any new owner, whether Nvidia or somebody else, is not going to cut their own throat by putting the screws to one of its best customers, biggest champion, and arguably the developmental leader of the architecture.

Deviating from the neutral approach it has taken with its licensees will also draw the attention of antitrust regulators, as well as turn off its other customers.

From Apple's standpoint, Masayoshi Son has already engaged them to determine their interest in acquiring ARM; Apple listened, but declined. If it was concerned, it could have easily afforded to buy the company, and to pay lawyers to fend off the legal actions antitrust regulators would have brought forth.

But, if Nvidia does successfully bid, and does subsequently threaten Apple, the latter could simply pivot, abandon ARM, and develop its own architecture that steers just clear enough of ARM's IP. It already has the hardware design expertise, and writes its own dev tools that provide the abstraction layer insulating software developers from the hassle, like it's already doing with moving the Mac to ARM. Those are the kinds of luxuries owning the whole widget affords.

These companies are not schoolyard kids battling each other for bragging rights. They're adults, and as long as everyone is making money by the bucket load, not going to engage in fights just for the sake of it.
 
This is when Apple kicked NVIDIA off its platform: https://forums.developer.nvidia.com/t/cuda-10-and-macos-10-14/65672

The rest is history.

You're right, it's all about money; however, I have seen these behemoths make moves that affected many of us without even batting an eyelash. Neither you nor I can predict the future. NVIDIA needs ARM for other reasons besides wanting to make money with Apple again. NVIDIA will never have an x86 license, yet they would like a CPU architecture they can fully control. Apple's and NVIDIA's issues with each other go way back, and it might have something to do with the closed nature of NVIDIA's software stack, which they don't want to open to Apple, or anyone else for that matter, plus some other screw-ups related to bad laptop GPUs that ended up costing Apple money. When it comes down to it, just like billionaires, these extremely wealthy companies can and will behave like little kids, but on a very large scale. Just like in the example above: Apple cut out NVIDIA just like that, and everyone who spent big money upgrading their ancient Mac Pro with new hardware, especially modded NVIDIA GPUs, is now either running an ancient GPU or has moved on to another platform.

Hell, they'll fight each other while doing business with each other. Apple and Samsung have done it for a long time. The problem is that Apple has become one of the most evil companies on the face of this planet. It's a shame Steve Jobs passed away; I miss him.

Anyway, I have no interest in Apple and I no longer spend money on their products, not even used. I avoid them and I can live my life just fine without an iPhone or an iPad or a Mac around me. I am amazed by NVIDIA's latest product, the RTX 3090. That is a solid GPU that obsoletes everything that came before it without even leaving room for comparison or doubt. It's a hell of a GPU.
 
^^ but this is the "business world" ^^
It doesn't matter if it's Apple, HP, Acer, etc.
When it comes to the public and to the consumer, Apple stands up for privacy where no Google or Microsoft product does.
We all have our reasons to buy a product and you have yours. I'm just saying that much of your post is the business world in general, and that Apple products will always perform better for the "general" public right off the shelf than in the PC world.
Let's face it, the general public is completely CLUELESS about proper computer "health".

I am not talking about "gamers", just the general public.
I'm only writing this because after 2.5 decades of using Windows I switched to a Mac mini in the fall of 2019 and have been blown away. So much so that, after using cell phones since 2001, I ditched my Android phone last fall and bought an iPhone XR. I love the simplicity, privacy, security, and integration of it all.
I will NEVER go back to Windows, even though I like them more than ANYTHING from Google, who to me is the enemy of mankind's privacy and freedom *LOL*
 