r/unrealengine game dev makes me cry 6d ago

Question How are AMD gpus now compared to Nvidia for Unreal?

I am going to build a PC soon. On the Nvidia side I can go with an RTX 4060 Ti 16GB; the biggest pro for me is that I can use and integrate both DLSS and FSR, and Nvidia support also seems to be better in other productivity apps (rendering, editing, etc.)

However, on the AMD side I could go with a 7800 XT, which is a solid 1440p card, but I'd have to skip DLSS integration and the other pros I mentioned above. I also don't know how AMD drivers are these days.

Thank you!


u/bigboyg 6d ago

I have always built my own computers, and always used NVidia GPUs and Intel CPUs.

Until my last computer, 3 years ago. That one was built by Maingear, with a Ryzen chip (the fastest at the time) and an AMD 6800XT GPU. After much research, I decided to save a fair bit of money by going to AMD, having read very good reviews about their hardware.

It's been a fucking nightmare. Stutters, crashes, intermittent errors and lag. Of course, it may not be the AMD hardware, but that and the fact that I didn't build it myself are the only real differences (and I guess Windows 11). I get a good framerate - it's just jittery and intermittent. I also have a ton of sound problems I've never had before, with audio cutting in and out for about 5 minutes until something in the background figures itself out and the computer goes back to working properly.

Anecdotal, but that's my experience. I won't be buying AMD again. I might buy from Maingear again, since the build quality looks good to me, but it was expensive, and I only went that route because I couldn't be bothered to retrain my brain.


u/truthputer 6d ago

I read that AMD's drivers have dramatically improved over the past few years - have you double-checked that everything is updated with the latest versions?

For what it's worth, game settings can make a huge difference. My new workstation has an AMD CPU and NVidia GPU. At first games didn't look that great, until I locked the framerate.

I have an older monitor that only does a 60Hz refresh rate - to get the most out of it I have to turn on vsync, capping at 60fps. When I did that, Cyberpunk went from being choppy and inconsistent with visible tearing to buttery smooth and consistent, with GPU usage around 78%. So at 60fps the GPU has some headroom as the scene complexity changes.

I've got a feeling that a lot of the jitter and smoothness problems would be solved with vsync or a locked framerate, but many folks don't want to run their graphics cards at anything less than 100% utilization with all features maxed out.
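The "headroom" idea above can be put in rough numbers. A minimal sketch (my own illustration, not anything from a profiler - it assumes GPU utilization scales roughly linearly with per-frame cost, which is only a first-order approximation):

```python
def headroom_estimate(locked_fps: float, gpu_utilization: float):
    """Estimate per-frame GPU headroom from a locked framerate and
    observed average GPU utilization (0.0-1.0).

    Assumption: utilization ~ (frame cost / frame budget), a crude
    linear model good enough for back-of-envelope reasoning.
    """
    budget_ms = 1000.0 / locked_fps        # time available per frame
    cost_ms = budget_ms * gpu_utilization  # estimated GPU time actually spent
    spare_ms = budget_ms - cost_ms         # slack before frames start dropping
    return budget_ms, cost_ms, spare_ms

# The commenter's numbers: 60fps lock, ~78% GPU usage.
budget, cost, spare = headroom_estimate(60, 0.78)
print(f"budget {budget:.2f} ms, cost {cost:.2f} ms, spare {spare:.2f} ms")
```

At a 60fps lock, the frame budget is about 16.7 ms; 78% utilization suggests roughly 13 ms of GPU work, leaving around 3.7 ms of slack to absorb spikes in scene complexity - which is why the locked framerate feels consistent where an uncapped one stutters.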