What if AMD produced a desktop graphics card that effectively mirrored console GPU specifications? Would such a product guarantee console-level performance in the PC space - or are there console-equivalent GPUs already on the market that do the job, and if so, how much do they cost? It's been known for a while now that AMD's Radeon RX 6700 10GB (the non-XT model) has startling similarities to the technological make-up of the PlayStation 5's GPU - so a while back, I bought one used from eBay, and the head-to-head results are certainly intriguing.
Looking at the core specs, the RX 6700 certainly looks like a dead ringer for the PS5's GPU - to a certain extent. Both have 2,304 shaders across 36 RDNA 2 compute units. Both have 144 texture mapping units and 64 ROPs. However, there are differences too. The RX 6700 has a 2.45GHz boost clock up against PS5's maximum 2.23GHz (though the way they boost is somewhat different). There are fundamental differences in the memory set-up as well: the desktop version of RDNA 2 has a different memory cache hierarchy, up to and including 80MB of Infinity Cache. PlayStation 5, meanwhile, operates over a much wider memory interface with more bandwidth, the caveat being that this resource is shared between CPU and GPU.
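Given matched shader counts, the clock difference is the main driver of the paper gap. A quick sketch of the standard peak FP32 calculation (shaders × 2 ops per cycle via fused multiply-add × clock) shows where the headline TFLOPs figures for both parts come from - bearing in mind that peak numbers ignore the memory-system differences described above, and PS5's variable clock means 2.23GHz is a ceiling, not a sustained figure:

```python
# Peak theoretical FP32 throughput for the two GPUs discussed above.
# Spec figures are the published ones; real-world clocks (especially on
# PS5's variable-frequency design) will sit at or below these peaks.

def peak_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPs: each shader retires 2 ops/cycle via FMA."""
    return shaders * 2 * clock_ghz / 1000

rx_6700 = peak_tflops(2304, 2.45)  # desktop card at rated boost clock
ps5 = peak_tflops(2304, 2.23)      # console at its maximum clock

print(f"RX 6700: {rx_6700:.2f} TFLOPs")  # ~11.29
print(f"PS5:     {ps5:.2f} TFLOPs")      # ~10.28
```

That works out to roughly 11.29 TFLOPs for the RX 6700 versus PS5's familiar 10.28 TFLOPs figure - a gap of just under 10 percent on paper.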
On paper then, the RX 6700 is marginally faster. However, a range of potentially confounding factors makes any kind of head-to-head benchmarking problematic - even beyond the spec variations. Sony uses a different GPU compiler and has its own graphics API, rather than the DirectX 12 or Vulkan typical on PC. Not only that, while we're confident in our ability to match settings in any given game between PS5 and PC (often with developer assistance) for benchmarking purposes, some games may have unexposed settings we're unable to tweak on PC, which may subtly impact our scores.