Details on Qualcomm's Adreno X1 GPU Architecture Emerge (2024)


Thursday, June 13th 2024


by Nomad76
Discuss (24 Comments)

Qualcomm has provided new details about the Adreno X1 GPU integrated into its latest Snapdragon X Elite/Plus series of processors. The Adreno X1 is Qualcomm's first custom-designed GPU for Windows on Arm systems. In the name of the highest configuration, the "X1-85," the "8" indicates the GPU tier and the "5" denotes the SKU within that tier.

The Adreno X1 features up to six shader processors with a total of 1,536 FP32 ALUs, and can sample 96 texels per clock. It delivers up to 4.6 TFLOPS of peak FP32 throughput and a pixel fill rate of 72 gigapixels per second. It supports the major graphics APIs, including DirectX 12 feature level 12_1 (Shader Model 6.7), DirectX 11, Vulkan 1.3, and OpenCL 3.0.
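As a back-of-the-envelope check (assuming the usual convention that one FMA counts as two FLOPs), the 4.6 TFLOPS figure and the 1,536-ALU count together imply the GPU's peak clock:

```python
# Peak FP32 throughput = ALU count x 2 FLOPs per FMA x clock speed.
# Working backwards from Qualcomm's quoted 4.6 TFLOPS gives the implied clock.
alus = 1536
peak_tflops = 4.6
implied_clock_ghz = peak_tflops * 1e12 / (alus * 2) / 1e9
print(f"Implied GPU clock: {implied_clock_ghz:.2f} GHz")
```

This works out to roughly 1.5 GHz, which matches the clock estimate discussed in the comments below.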

Qualcomm compared the Adreno X1-85 to Intel's Core Ultra 7 155H mobile processor with 8 Xe cores, claiming that the Adreno X1-85 matches or exceeds Intel GPU performance in many games at 1080p resolution. However, Qualcomm did not provide specific details about the game settings or test platform.

Additionally, Qualcomm introduced the "Adreno Control Panel," an app for game optimization and driver updates, which Qualcomm promises to update at least once a month.

Qualcomm Snapdragon X laptops will be available from June 18; they can already be preordered.

Sources: ITHome, VideoCardz

#1
natr0n

would be nice for discrete cards maybe one day like the days of old. Many options for the same goal.

#2
AnotherReader

4.6 TFLOPS despite being a 50% wider GPU than the one in the Core Ultra 7 155H indicates relatively low clock speeds: around 1.5 GHz at best. They are surprisingly light on GPRs, which would impact performance in register-limited scenarios. Intel also has a relatively large L2 feeding fewer shading units, so I'm rather skeptical of Qualcomm's numbers. In their graph, the gap between the Radeon 780M and Meteor Lake is far greater than in reality.
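The "50% wider" claim can be sanity-checked from public specs. A sketch, assuming the commonly cited figure of 128 FP32 lanes per Xe core for the Arc iGPU in the Core Ultra 7 155H:

```python
# Adreno X1-85: 1536 FP32 ALUs.
# Core Ultra 7 155H iGPU: 8 Xe cores, each with 16 vector engines
# of 8 FP32 lanes, i.e. 128 FP32 ALUs per Xe core.
adreno_alus = 1536
xe_cores, alus_per_xe = 8, 128
intel_alus = xe_cores * alus_per_xe
print(f"Width ratio: {adreno_alus / intel_alus:.2f}x")
```

That gives 1,536 vs. 1,024 ALUs, i.e. exactly the 1.5x width the comment refers to.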

#3
TheLostSwede

News Editor

AnotherReader4.6 TFlops despite being a 50% wider GPU than the one in the Core Ultra 155H indicates relatively low clocks speeds: around 1.5 GHz at best. They are surprisingly light on GPRs which would impact performance in register-limited scenarios. Intel also has a relatively large L2 feeding fewer shading units so I'm rather skeptical of Qualcomm's numbers. In their graph, the gap between the Radeon 780M and Meteor Lake is far greater than in reality.

Keep in mind that the Snapdragon X Elite has an 8 x 16-bit memory interface with LPDDR5X-8448 that can deliver 135 GB/s. This might be how they're able to deliver better GPU performance, as the GPU won't be as bandwidth-starved.

#4
AnotherReader
TheLostSwedeKeep in mind that the Snapdragon X Elite has a 8 x 16-bit memory interface with LPDDR5x-8448 that can deliver 135GB/sec. This might be how they're able to deliver better GPU performance, as the GPU won't be as bandwidth starved.

Meteor Lake also supports LPDDR5X at speeds of up to 7467 MT/s. We'll see when reviewers get their hands on these systems. Even according to Qualcomm's own slides, in quite a few of the tested games, the two have equivalent performance with Horizon Zero Dawn and Rocket League being notable outliers.

#5
TheLostSwede

News Editor

AnotherReaderMeteor Lake also supports LPDDR5X at speeds of up to 7467 MT/s. We'll see when reviewers get their hands on these systems. Even according to Qualcomm's own slides, in quite a few of the tested games, the two have equivalent performance with Horizon Zero Dawn and Rocket League being notable outliers.

Another issue is likely to be x86 emulation, if you have a look at the bottom of this page.
www.anandtech.com/show/21445/qualcomm-snapdragon-x-architecture-deep-dive/2
It's not clear if the games tested are running in emulation mode or not, but I presume most are.

Also, I can't find a single review that lists the memory specs of the Acer Swift Go 14 that was used for the testing.

#6
AnotherReader
TheLostSwedeAnother issue is likely to be x86 emulation, if you have a look at the bottom of this page.
www.anandtech.com/show/21445/qualcomm-snapdragon-x-architecture-deep-dive/2
It's not clear if the games tested are running in emulation mode or not, but I presume most are.

That is a very good point. However, in a power constrained laptop, the overhead of CPU emulation is unlikely to shift the bottleneck from the GPU. Still, it would be best if we had ARM native games in the benchmark suite.

#7
LabRat 891
natr0nwould be nice for discrete cards maybe one day like the days of old. Many options for the same goal.

IIRC, that one Chinese company trying to break into GPUs based their uArch on mobile PowerVR/S3 tech (common in early smartphones and ARM mobile devices, IIRC). I *really* miss PowerVR in particular.

As an aside,
The irony is palpable. Adreno traces back to ATI's mobile Radeon lineage, hence the anagram: R A D E O N <-> A D R E N O.

#8
dj-electric

I wish Qualcomm luck with Windows GPU drivers for new and old games.
They are going to need all of it.

#9
Denver
TheLostSwedeKeep in mind that the Snapdragon X Elite has a 8 x 16-bit memory interface with LPDDR5x-8448 that can deliver 135GB/sec. This might be how they're able to deliver better GPU performance, as the GPU won't be as bandwidth starved.

And Snapdragon Elite will end up facing Lunar-Lake and Strix/Kraken anyway. The GPU lacks full support for DX12U features. Based on initial observations, it seems unlikely to be competitive with x86 alternatives. However, upcoming reviews may provide a different perspective.

Adreno X1 GPU Architecture: A More Familiar Face - The Qualcomm Snapdragon X Architecture Deep Dive: Getting To Know Oryon and Adreno X1 (anandtech.com)

#10
trsttte

I'm not really worried about the performance of this GPU; as long as it runs a desktop environment, it's fine. It would be nice if it could come close to AMD's APU offerings, but like them, it's not supposed to be a 3D gaming powerhouse, just as the AMD and Intel versions aren't.

What I'm interested in is the entire real picture, they've been doing a slow drip of information for months with fluff pieces everywhere but no one has been able to verify anything independently. It's all "uhh seems great, very fast, much battery and cool" but no real performance comparisons, only claims from Qualcomm. GPU raw power is the least important part of that, the chip supports USB4 and has 8x pcie lanes available for a discrete gpu if that's your thing.

#11
TheLostSwede

News Editor

trsttteI'm not really worried about the performance of this gpu, as long as it runs a desktop environment it's fine. It would be nice if it could come close to AMD apu offerings but like it, it's not supposed to be a 3d gaming powerhouse just like AMD and Intel versions aren't.

What I'm interested in is the entire real picture, they've been doing a slow drip of information for months with fluff pieces everywhere but no one has been able to verify anything independently. It's all "uhh seems great, very fast, much battery and cool" but no real performance comparisons, only claims from Qualcomm. GPU raw power is the least important part of that, the chip supports USB4 and has 8x pcie lanes available for a discrete gpu if that's your thing.

Supposedly reviews will be out next week.

#12
Darmok N Jalad
trsttteI'm not really worried about the performance of this gpu, as long as it runs a desktop environment it's fine. It would be nice if it could come close to AMD apu offerings but like it, it's not supposed to be a 3d gaming powerhouse just like AMD and Intel versions aren't.

What I'm interested in is the entire real picture, they've been doing a slow drip of information for months with fluff pieces everywhere but no one has been able to verify anything independently. It's all "uhh seems great, very fast, much battery and cool" but no real performance comparisons, only claims from Qualcomm. GPU raw power is the least important part of that, the chip supports USB4 and has 8x pcie lanes available for a discrete gpu if that's your thing.

Yeah, the lack of verification is concerning to me too. They are pricing these in a more premium market, and they continually make vague claims of being faster that seem very cherry-picked. Like, the X is faster than the M3 in multithread. Okay, the X has more cores than the base M3, but not the M3 Pro and Max. Even then, that comparison is short-lived, since M4 Macs are just around the corner. I think they really pressed to launch this now, as in a few months it's going to look a lot less impressive with new products from QC's peers launching shortly.

#13
Readlight

The Witcher uses temporal post-process anti-aliasing, not Ubersampling/supersampling. It uses hair physics, lots of shadows, lighting, and vegetation draw range.

#14
trsttte
Darmok N JaladYeah the lack of verification is concerning to me too. They are pricing these in a more premium market, and they continually make vague claims of faster that seem very cherry picked. Like the X is faster than the M3 in multithread. Okay, the X has more cores than the base M3, but not the M3 Pro and Max. Even then, that comparison is short-lived since M4 Macs are just around the corner. I think they really pressed to launch this now, as in a few months it’s going to look a lot less impressive with new products from QC’s peers launching shortly.

They've enjoyed an exclusivity deal with Microsoft for at least 5 years (since the original Surface Pro X launch, but probably longer; a Tom's Hardware article says 2016) and have done basically nothing with it. Neither did Microsoft, to be fair. The biggest hurdle was always going to be all the x86 legacy, and that's on them, but this deal brought innovation to a standstill. Google is doing cool stuff with Chromebooks and is able to use cheaper offerings from MediaTek and whoever else. No, they're not fast, but they dominated the education market with fast-enough cheap devices while Microsoft was asleep at the wheel.

Now that the deal is set to expire (allegedly at the end of this year), they come out swinging with this big thing. At least it's not priced out of this world like the rebadged SQ chips and Surface Pro X always were, but it's still going for quite a high price bracket without anything to back their claims of improvement. Sure, a 1,200€ Snapdragon X Elite laptop will be fast and cool, but so would a Ryzen 8945HS or 8845HS laptop, if anyone bothered to put one in something other than a gaming brick.

#15
GoldenX

This seems slow and slightly outdated. Plus I can almost guarantee that Vulkan driver will SUCK ARSE.

But if it at least is efficient, I could let it pass.

#16
Minus Infinity
TheLostSwedeSupposedly reviews will be out next week.

Well, a batch of preliminary reviews is looking bad for the X Elite: slower than the iPhone 12 mini! Clocks were lower than they could have been, but even with the low clocks, battery life still wasn't great. Looks like Qualcomm has been over-hyping the SoC, much like AMD with RDNA3.

I have no doubt Strix Point 9945HS will destroy this in all metrics including iGPU.

#17
Denver

www.tomshardware.com/laptops/snapdragon-x-elite-in-the-wild-is-allegedly-slower-than-iphone-12-first-benchmarks-of-samsung-book4-edge-disappoint?

It looks like it could be a firmware problem, though.

"Before you grab your pitchfork and join the mob, there are multiple possible reasons for this disappointing performance. The most obvious is a hardware fluke. The CPU never boosted above 2.52 GHz in the Reddit tests. This is an obvious concern as the Elite X advertises boost clocks up to 4.0 GHz. The above X thread alleges that the Galaxy Book4 Edge physically limited the CPU speeds to keep cooling and battery performance maintainable, but this doesn't line up with other GeekBench scores for the Samsung laptop arriving today with some of the faster scores in the Copilot+ lineup."

But my torches are always lit /s

#18
trsttte
Minus InfinityI have no doubt Strix Point 9945HS will destroy this in all metrics including iGPU.

And as usual it will only be put to use in a couple of gaming laptops :(

#19
Darmok N Jalad
Minus InfinityWell a batch of rpeliminary reviews are looking bad for X elite, slower than the iPhone 12 mini! Clocks were lower than they could, but still battery likfe was not great even wioth the low clocks. Looks like Qualcomm has been over-hyping the SoC much like AMD and RDNA3.

I have no doubt Strix Point 9945HS will destroy this in all metrics including iGPU.

Yeah, I guess we'll wait for actual delivery day, but it's been a bit fishy from QC since they first started talking up this SOC. If the performance issues are true, and battery life isn't revolutionary, then what really is the value of these over anything x86 that you can confidently know will run all your stuff? MS is backpedaling a bit on Recall (a "first" feature on these QC devices), and if the performance on these is bad, then it's about the worst launch since, well, MS's first attempt at WOA with Surface RT. Lots of hype and effort for something no one will want at the price they are asking.

Denverwww.tomshardware.com/laptops/snapdragon-x-elite-in-the-wild-is-allegedly-slower-than-iphone-12-first-benchmarks-of-samsung-book4-edge-disappoint?

It looks like it could be a firmware problem, though.

"Before you grab your pitchfork and join the mob, there are multiple possible reasons for this disappointing performance. The most obvious is a hardware fluke. The CPU never boosted above 2.52 GHz in the Reddit tests. This is an obvious concern as the Elite X advertises boost clocks up to 4.0 GHz. The above X thread alleges that the Galaxy Book4 Edge physically limited the CPU speeds to keep cooling and battery performance maintainable, but this doesn't line up with other GeekBench scores for the Samsung laptop arriving today with some of the faster scores in the Copilot+ lineup."

But my torches are always lit /s

It would have to be something serious, or there might be legit fraud here. However, a FW issue isn't all that promising either, as it's not a good look to have serious performance bugs leading up to launch. When Apple went to Arm, they hit M1 out of the park. It was no-nonsense, ready for business, with really good emulation and insane battery life advances, from day one. I would hope this wouldn't be a half-baked rollout with as much promoting as was going on. Unfortunately, half-baked is how a lot of tech things get launched these days.

#20
G777
Minus InfinityI have no doubt Strix Point 9945HS will destroy this in all metrics including iGPU.

In case you missed it, it's called the Ryzen AI 9 HX 370 now ;)

#21
Assimilator

Smartphone GPU maker says their smartphone GPU is faster than real GPUs, news at 11.

#22
AnarchoPrimitiv
trsttteSure a 1200€ snapdragon elite laptop will be fast and cool, so would a ryzen 8945hs or 8845hs laptop if anyone bothered to put one in something other than a gaming brick

This sums up my frustrations perfectly....we should have tons of products using these AMD APUs....ALONE without a dGPU, but there are not that many.

Also...what's with that graph? Why is it showing the 780m to be the worst while in reality it is faster than the Intel iGPU in actual games?

#23
Minus Infinity
trsttteAnd as usual it will only be put to use in a couple of gaming laptops :(

We'll see, I am seeing plenty of makers readying their AMD based laptops. Already plenty of Hawk Point based laptops available, and they're not all gaming.

#24
Tropick

It looks like some initial real-world benchmarks are being released for the chips, and there are some serious boost issues on certain laptops. Some of Samsung's units aren't letting the chip boost past 2.53 GHz even plugged in. Seems like firmware/BIOS headaches are incoming.

EDIT: It looks like some units are able to hit advertised clocks and are performing pretty close to Qualcomm's advertised scores
