AMD Ryzen 7 5700X3D Review: 3D V-Cache for $250

Would like to know what GPU suits it best. The 4090 is overkill, as we can see in the article, and probably the 4080/Super as well. But if it's paired with a 3080/6800 XT, for example, how big is the difference between the 5700X and 5700X3D?

We need to find the first GPU in the whole lineup that doesn't bottleneck the 5700X3D, and then compare it against competitor CPUs. Then we'll see whether it makes practical sense when upgrading a rig on AM4, and whether to choose the 5700X or the 5700X3D. If it makes sense at all.

The 5700X is much cheaper, it clocks higher and can be OC'd (or undervolted), and its TDP is 65 W, as already noted in this topic. It's faster in many non-cache-sensitive scenarios.

Most of these points also apply to the 5600 non-X.

Or in other words: what GPU is bottlenecked by the 5x00X but not by the 5x00X3D? And how big is the difference? A 3080/3090, probably? 10% at 1080p and 5% at 1440p? Where's the most valuable data? GPU and CPU scaling, I mean.
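
A rough way to think about that question (all figures below are made up, purely for illustration): the framerate you actually see is roughly the lower of what the CPU can feed and what the GPU can render at a given resolution, so the X3D's advantage only shows up where the GPU's cap sits above both CPU caps.

```python
# Minimal sketch of the "which GPU stops bottlenecking it" reasoning.
# The framerate caps below are assumptions for illustration, not benchmark data.
cpu_fps_cap = {"5700X": 150, "5700X3D": 175}          # assumed CPU-limited framerates
gpu_fps_cap = {"1080p": 200, "1440p": 140, "4K": 80}  # assumed GPU-limited framerates per resolution

for res, gpu_cap in gpu_fps_cap.items():
    a = min(cpu_fps_cap["5700X"], gpu_cap)     # delivered fps with the 5700X
    b = min(cpu_fps_cap["5700X3D"], gpu_cap)   # delivered fps with the 5700X3D
    uplift = (b - a) / a * 100
    print(f"{res}: 5700X {a} fps, 5700X3D {b} fps, uplift {uplift:.0f}%")
```

With those made-up caps the gap is ~17% at 1080p and nothing at 1440p/4K, which is exactly why the question only has an answer at resolutions where the GPU cap clears both CPUs, and why reviewers test at 1080p to expose the CPU difference at all.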

Those links include Intel chips... soooooo not really an ad article for AMD.
 
I don't see what the point of the 5700X3D is, from my perspective. I have the 5700X, which can be had for less than $200 now. I find it to be an excellent performer for my usage.
 
For the price, this is a bangin' deal for those still on AM4 looking to upgrade from Ryzen 3000, IMPO!
 
The multiplier is locked on X3D parts. Always has been. No OC allowed.

Yes, you can OC them, via BCLK. It works with boards that have an external clock generator and aren't dependent on the CPU's own. At best you can squeeze out a 104 MHz BCLK, which works out to approx. 4,264 MHz, still considered quite the uplift coming from stock with no voltage bump.

But BCLK is risky; some NVMe SSDs or NICs can't handle anything beyond 103 MHz. The whole reason it's locked is the X3D cache, which is tied to the CPU Vcore. It doesn't allow more than 1.35 V, if I'm correct, and frying the chip is imminent once you go beyond that voltage.
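
For the clock math being described, a minimal sketch of the arithmetic (the 100 MHz reference clock and the 4,100 MHz boost figure are taken from the stock spec; everything else is illustrative):

```python
# BCLK overclocking arithmetic for a multiplier-locked X3D part.
stock_bclk_mhz = 100     # standard reference clock
target_bclk_mhz = 104    # the ~104 MHz ceiling mentioned above
boost_clock_mhz = 4100   # 5700X3D advertised maximum boost

# With the multiplier locked, the core clock scales linearly with BCLK.
new_core_clock = boost_clock_mhz * target_bclk_mhz / stock_bclk_mhz
print(f"~{new_core_clock:.0f} MHz core clock")  # ~4264 MHz, i.e. ~4.26 GHz

# The risk: PCIe/NVMe and other buses derived from the same reference clock
# get overclocked too, unless the board can decouple them.
```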

It's no special CPU - AMD has a line of EPYCs where lots of cache really is beneficial for certain customers - these chips did not meet spec or qualify and are resold as consumer parts. That's where the whole X3D came from - they were server CPUs that initially didn't meet requirements.

Because of the low clocks on this chip, I think it's better to opt for the fastest you can get - the 5800X3D. Or even the 5950X if you want the best AM4 has to offer.
 
Zen 3 was not “always overpriced”, you could get a 5600 for $99 at one point, and 8-cores were in the mid $100s range for quite some time. The X3D chips represented a terrific end-of-platform upgrade for those of us who bought in at a prior gen, but for those of us on Zen 1/+, even the non X3D chips are a huge upgrade in gaming. X3D is always going to be sold at a premium since it’s the best gaming experience the platform has to offer.
5600X launched at $299 and 5800X was $449.
Not always overpriced, but it was at the worst time. Launch.
 

Those were excellent, class-leading parts when they launched, combined with the fact that AMD was also trying to get rid of the budget stigma that they still had at the time.

What was AMD supposed to do, price them at today's prices instead and undercut their own existing stock at the time? And screw over their distributors and retailers? No. That's not good business. They were priced strategically, and affordable alternatives from AMD themselves (the 3000 series) were still widely available.
 
You didn't read the TechSpot reviews and conclusions then. Or the day-one comments.
 
Literally in the third paragraph of the 5800X review:

“If you're wondering why AMD has positioned the 5800X so poorly, the answer is simple: it doesn't make sense for them to sell it any cheaper.”

Again, this is business. This particular strategy worked for AMD for what they were trying to do, regardless of how you or anybody else on whichever tech forum felt about the way they priced it. Either buy something else, or wait for the price to fall. Simple.
 
Irrelevant. I'm not a business. I'm a consumer. Both were priced poorly.
 
A review made only for framerate junkies, as a 4090 and 1080p gaming is overkill for most people. TechSpot has done 4K testing in CPU reviews before, but not this time. The idea is to show the CPU bottleneck, not real-world usage for people playing at 1440p or 4K. I don't see much relevance in this review, or in the many others that don't understand that 4K gaming is quite common today; most reviewers don't test it because they're only interested in how much a CPU bottlenecks.

If I have an old but still capable computer, I want to know how it performs in 4K gaming. I'm not interested in 1080p bottleneck testing. The CPU is important in 4K gaming, and the old idea that the CPU doesn't matter at 4K is just wrong. Gamers Nexus is still clinging to that idea, and it's frustrating that the tech review community doesn't get it and doesn't test older CPUs paired with newer GPUs. Why replace a perfectly good CPU if you don't need to? Why spend that extra cash if you don't have to, and how much is it actually worth?

I think the shift to reviewers testing at 4K will happen eventually, but it's the same slow, frankly dumb process as the shift from testing only 720p to 1080p because it supposedly "didn't matter"... I have waited for 4K testing for many, many years, but there are still very few who actually do it, and also test slightly older CPUs. I guess it's good for companies to make more money on an extra CPU and motherboard with every upgrade. I guess that's why this plain and ridiculous idea makes sense and gets shoved down our throats.
 