The AMD Ryzen 7 5800X3D Review: 96 MB of L3 3D V-Cache Designed For Gamers
by Gavin Bonshor on June 30, 2022 8:00 AM EST - Posted in
- CPUs
- AMD
- DDR4
- AM4
- Ryzen
- V-Cache
- Ryzen 7 5800X3D
- Zen3
- 3D V-Cache
Gaming Performance: 720p and Lower
All of our game testing results, including other resolutions, can be found in our benchmark database: www.anandtech.com/bench. All gaming tests were run with an RTX 2080 Ti.
For our gaming tests in this review, we re-benched the Ryzen 7 5800X processor to compare it directly against the newer Ryzen 7 5800X3D on Windows 11. All previous Ryzen 5000 processors were tested on Windows 10, while all of our Intel Alder Lake (12th Gen Core Series) testing was done on Windows 11.
We are using DDR4 memory at the following settings:
- DDR4-3200
Civilization VI
Final Fantasy 14
Final Fantasy 15
World of Tanks
Borderlands 3
Far Cry 5
Gears Tactics
Grand Theft Auto V
Red Dead Redemption 2
Strange Brigade (DirectX 12)
Strange Brigade (Vulkan)
At 720p resolutions and lower, we are significantly (and intentionally) CPU limited. This gives the Ryzen 7 5800X3D and its 3D V-Cache the chance to shine.
The addition of 3D V-Cache to one of AMD's mid-range chips makes the Ryzen 7 5800X3D a much more potent option in gaming, with consistently better performance than the Ryzen 7 5800X. This is very much a best-case scenario for AMD, and as we'll see, won't be as applicable to more real-world results (where being GPU limited is more common). But it underscores why AMD is positioning the chip as a gaming chip: because many of these workloads do benefit from the extra cache (when they aren't being held back elsewhere).
In any case, the 5800X3D compares favorably to its more direct competition, the Intel Core i9-12900K and Ryzen 9 5950X (which are both more expensive options). In AMD partnered titles, the Ryzen 7 5800X3D does extremely well.
125 Comments
View All Comments
Qasar - Thursday, June 30, 2022 - link
Makaveli, he won't, according to only him. the m1 is the best thing since sliced bread.
GeoffreyA - Thursday, June 30, 2022 - link
Lor', the Apple Brigade is already out in full force.
at_clucks - Saturday, July 2, 2022 - link
Look, if we're being honest the M line punches above its weight so to speak and yes, it does manage to embarrass traditional (x86) rivals on more than one occasion. This being said, I see no reason to review it here and compare it to most x86 CPUs. The reason is simple: nobody buys an M CPU, they buy a package. So comparing M2 against R7 5800X3D is pretty useless. And even if you compare "system to system" you'll immediately run into major discrepancies, starting with the obvious OS choice, or the less obvious "what's an equivalent x86 system?".
With Intel vs. AMD it's easy, they serve the same target and are more or less a drop in replacement for each other. Not so with Apple. The only useful review in that case is "workflow to workflow", even with different software on different platforms. Not that interesting for the audience here.
TheMode - Tuesday, July 5, 2022 - link
I never understood this argument. Sure some people will decide never to buy any Apple product, but I wouldn't say that this is the majority. Let's assume that M3 gets 500% faster than the competition for 5% of the power, I am convinced that some people will be convinced to switch over no matter the package.
GeoffreyA - Wednesday, July 6, 2022 - link
I'd say it's interesting to know where the M series stands in relation to Intel and AMD, purely out of curiosity. But, even if it were orders faster, I would have no desire to go over to Apple.
mode_13h - Thursday, July 7, 2022 - link
Yes, we want to follow the state of the art in tech. And when Apple is a leading player, that means reviewing and examining their latest, cutting edge products.
Jp7188 - Friday, July 8, 2022 - link
Perhaps that could make sense in a separate piece, but M1 doesn't really have a place in a gaming focused review. M1 gaming is still in its infancy as far as natively supported titles.
Skree! - Friday, July 8, 2022 - link
Skree!
mode_13h - Sunday, July 10, 2022 - link
I'm going to call spam on this. Whatever it's about, I don't see it adding to the discussion.
noobmaster69 - Thursday, June 30, 2022 - link
Better late than never I guess. Am I the only one who found it puzzling that Gavin recommends DDR4-3600 and then immediately tests with a much slower kit? And ran gaming benchmarks with a 4 year old GPU?