Apple's Homegrown Chips Could Be the End for AMD Graphics in Macs


Apple’s transition to its own Mac silicon has brought up a slew of questions, and now it seems like the future of the company’s third-party graphics card support in Macs is up for debate.

Apple CEO Tim Cook on stage at last year's Worldwide Developers Conference. © Photo: Brittany Hosea-Small/AFP (Getty Images)

According to AppleInsider, the company made it clear during a WWDC 2020 developer session, and in a developer support document, that its home-brewed CPUs will come with its own GPUs, too. The company already pairs its own GPUs with its ARM-based processors in other devices, like iPads and iPhones. If Apple is parting ways with Intel, who's to say it won't also part ways with AMD, which makes the discrete GPUs that power higher-end Macs? There's no indication one way or the other, and Apple is staying silent on the matter, so all that's left to do is speculate.


“Apple Silicon Mac contains an Apple-designed GPU, whereas Intel-based Macs contain GPUs from Intel, AMD and Nvidia,” Gokhan Avkarogullari, Apple’s director of GPU software, said during the WWDC session.

It’s not surprising that that’s the case. Apple confirmed during its WWDC keynote that the company is making its long-rumored move away from Intel processors to its own system-on-chip (SoC) with Apple Silicon, which includes a move to its own integrated GPUs. What’s unclear is what that means for future discrete GPU support. Apple officially stopped supporting Nvidia GPUs when it released macOS Mojave in 2018, but has continued to offer a range of Macs with AMD graphics.

In the near future, AMD discrete GPUs aren't going anywhere. Apple recently added a new GPU configuration option to its Mac Pro desktop tower, AMD's Radeon Pro 5500X, and the company said during WWDC that it would be releasing new Intel-based Macs as well. But one of the things the company pointed out during its developer session is the difference between Apple GPUs and third-party GPUs: Apple's GPU architecture is a tile-based deferred renderer (TBDR), while Intel, Nvidia, and AMD GPUs are immediate mode renderers (IMR).


Screenshot: Apple

TBDR captures the entire scene before it starts to render it, splitting it up into many small regions, or tiles, that get processed separately in fast on-chip memory, so it processes information quickly and doesn't require a lot of memory bandwidth. Within each tile, the architecture rejects any and all occluded pixels before it actually shades the scene.

On the other hand, IMR does things the other way around: it shades fragments as draw calls arrive and only afterward decides which pixels need to be thrown out, so pixels that end up hidden can be shaded and then discarded. As you probably guessed, that overdraw is inefficient, yet it's how modern discrete GPUs operate, and they need a lot of memory bandwidth to do it.
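The difference can be sketched in a toy model. The quads, depths, and one-unit shading cost below are purely illustrative (no real GPU pipeline works in Python lists), but they show why overdraw hurts IMR: with a few overlapping quads submitted back-to-front, the worst case, IMR shades every covered fragment of every quad, while TBDR resolves visibility per pixel first and shades each visible pixel once.

```python
# Toy model contrasting immediate mode rendering (IMR) with tile-based
# deferred rendering (TBDR). Quads are axis-aligned rects with a depth;
# lower depth = closer to the camera. Illustrative only.

W, H = 64, 64
TILE = 16

# (x0, y0, x1, y1, depth), submitted back-to-front (worst case for IMR)
quads = [
    (0, 0, 64, 64, 0.9),    # full-screen background
    (8, 8, 56, 56, 0.5),    # large mid-layer quad
    (16, 16, 48, 48, 0.1),  # foreground quad occluding much of the rest
]

def covers(q, x, y):
    x0, y0, x1, y1, _ = q
    return x0 <= x < x1 and y0 <= y < y1

def imr_shaded_fragments():
    """IMR shades fragments as primitives arrive; with back-to-front
    submission, every overdrawn pixel gets shaded again."""
    shaded = 0
    depth = [[float("inf")] * W for _ in range(H)]
    for q in quads:
        for y in range(H):
            for x in range(W):
                if covers(q, x, y) and q[4] < depth[y][x]:
                    shaded += 1          # fragment shader runs
                    depth[y][x] = q[4]   # then the depth buffer updates
    return shaded

def tbdr_shaded_fragments():
    """TBDR bins primitives into tiles, resolves visibility per pixel
    first, then shades only the one visible fragment per pixel."""
    shaded = 0
    for ty in range(0, H, TILE):
        for tx in range(0, W, TILE):
            # Binning: keep only primitives that touch this tile.
            binned = [q for q in quads
                      if q[0] < tx + TILE and q[2] > tx
                      and q[1] < ty + TILE and q[3] > ty]
            for y in range(ty, ty + TILE):
                for x in range(tx, tx + TILE):
                    if any(covers(q, x, y) for q in binned):
                        shaded += 1  # one shade per visible pixel
    return shaded

print("IMR shaded fragments: ", imr_shaded_fragments())   # 7424
print("TBDR shaded fragments:", tbdr_shaded_fragments())  # 4096
```

In this scene IMR shades 7,424 fragments against TBDR's 4,096, and every one of those extra fragments also costs a round trip to the depth and color buffers, which is where the bandwidth goes.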

For Apple Silicon's ARM architecture, TBDR is a much better match because its focus is on speed and lower power consumption, not to mention the GPU sits on the same chip as the CPU, hence the term SoC. This is probably why Apple wrote, "Don't assume a discrete GPU means better performance," in its developer support document. It's all that dang bandwidth it doesn't need.


It could also be a reason why the Shadow of the Tomb Raider demo (running on Rosetta 2) Apple showed off during its keynote looked so good. I'm no game designer, but if Apple is helping developers port their games not only to its ARM architecture but also to its GPU architecture, it just might grow some more teeth in the gaming sphere. And if that happens, Macs might actually become competitive gaming machines once you start to compare benchmarks.

I'd still be highly skeptical of the cost of Apple's future machines, though, especially since you can currently build or buy a PC with better specs for much less than a Mac. There's also something to be said about the DIY culture baked into the Windows-based PC market. Apple has generally made its customers rely entirely on the company to fix hardware-related issues or to upgrade, and if it wants to attract more developers to code their games for its hardware and macOS, understanding PC gaming culture would go a long way. For some, it might not matter if Apple's GPUs are technically better.

Like Intel, AMD will stick it out with Apple for as long as it can, until Apple is positive it can survive without any third-party hardware components. Then the walled garden will be fully grown.

We reached out to Apple for comment on its future AMD GPU plans, but have yet to receive a response. We will update if/when we hear back.
