Xbox One X bottlenecked?


Hi all,

So I’m not sure if Paul or anyone has discussed this yet on this site, but I’ve been thinking about it lately.

I own an Xbox One X, and I think the visuals are fantastic and, well, I don’t really notice frame rate issues.

But, I’ve read that the Jaguar (is that what it is?) CPU, even though it’s a little better than the one in the regular Xbox One, is kind of holding the GPU in the console back.

Think that’s true?

I don’t know; my understanding is that games generally rely on the GPU more anyway, so maybe it isn’t an issue. But of course the two go hand in hand, because you need a fast CPU to feed the GPU the data it needs.
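The way I picture it: each frame the CPU runs the simulation and submits draw calls while the GPU renders, and roughly speaking the slower of the two sets the frame rate. Here's a toy sketch of that idea (the millisecond numbers are made up for illustration, not real Xbox One X measurements):

```python
# Toy model: CPU and GPU work on frames in parallel, so the slower
# side sets the pace. All numbers below are invented for illustration.
def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower of CPU/GPU limits the pipeline."""
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=16.0, gpu_ms=12.0))  # CPU-bound: 62.5 fps
print(fps(cpu_ms=16.0, gpu_ms=6.0))   # doubling GPU power: still 62.5 fps
print(fps(cpu_ms=16.0, gpu_ms=33.0))  # heavy 4K GPU load: ~30.3 fps
```

If the CPU side is the ceiling, a bigger GPU only buys you resolution and eye candy, not frame rate, which might be why I don't notice frame rate problems even if the CPU is technically the weak link.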

Thoughts?


5 responses to “Xbox One X bottlenecked?”

  1. SWCetacean

    I don't doubt it. The Jaguar architecture is, like all pre-Zen AMD CPU architectures, not that good in terms of single-thread performance. That limits how fast the CPU can issue commands to the GPU. I'd rate the Jaguar CPU as the second-slowest component in the Xbox One X (behind the spinning hard disk drive). It's beefed up compared to the CPU in the regular Xbox One, but higher clocks aren't going to fix an inherently slow microarchitecture.


    As to whether it's really bottlenecking the GPU, I think it's less of a bottleneck than it would be in a PC of equivalent specs. The reason is that Microsoft and AMD worked together to implement a bunch of DirectX command translations in hardware, so the main CPU doesn't need to go through the process of translating command bundles into low-level GPU instructions. So the weak CPU doesn't hurt Xbox One X performance as much as if you placed that same CPU in a PC and tried to run the same workloads.


    Now, it's telling that the focus of both of the current gen+ consoles (XB1X and PS4 Pro) is higher resolutions. Resolution is pretty much entirely GPU-dependent. Nearly every enhanced title for Xbox One X has a higher resolution than the standard version; only a minority allow for higher frame rates. Frame rate is more CPU-dependent than resolution (though the GPU also plays a significant role). And the algorithmic complexity of the games (number of objects in the simulation, complexity of the game systems and simulation) is rarely increased in the enhanced titles; expanding the simulation is entirely CPU-dependent. Thus the marketing push for these enhanced consoles lines up with their major hardware improvements: a huge increase in GPU power, with a smaller bump in CPU power. This lends itself well to enabling higher resolutions, but doesn't help much with frame rate or game complexity. So in that sense, the lower CPU:GPU power ratio is limiting the enhancements in games to graphics and resolution rather than frame rate or complexity.


    I think it's interesting that Project Scarlett is supposed to enable 120 fps gameplay; pushing 120 fps consistently requires good CPU performance, and putting that feature in the marketing shows that Microsoft has some confidence in the level of CPU performance in the upcoming console.

  2. madthinus

    I think the single biggest improvement next generation will be the CPU cores. They are based on Zen 2 and the uplift will be fantastic. I'm someone who moved to a GeForce RTX 2070 recently and was disappointed with the performance, only to be gobsmacked by the uplift from the card once I replaced my four-year-old CPU and motherboard combo. So going from Jaguar cores (which is, to be honest, Atom-class performance) to Zen 2 is going to be a big leap.

    • ErichK

      In reply to madthinus:

      Aw man ... I was kind of thinking the same thing recently about my gaming rig, but I just didn't want to admit it. Last Christmas I bought a GTX 1070 to pair with my i5-4690K. Sometimes I think I need a more recent CPU to feed that GPU. But, I don't know. I have my i5 overclocked, and it's still a great performer; plus, the GTX 1070 is not as super-enthusiast as a 1080 or 2070/2080. My games run great, and that's all that matters.
