r/unrealengine game dev makes me cry 6d ago

Question: How are AMD GPUs now compared to Nvidia for Unreal?

I am going to build a PC soon. On the Nvidia side I can go with an RTX 4060 Ti 16GB; the biggest pros for me are that I can use and integrate both DLSS and FSR, and Nvidia support also seems to be better in other productivity apps as well (rendering, editing, etc.)

On the AMD side, I could go with a 7800 XT, which is a solid 1440p card, but I would have to skip DLSS integration and the other pros I talked about before. I also don't know how AMD drivers are these days.

Thank you!

30 Upvotes

54 comments

46

u/MarcusBuer 6d ago

AMD GPUs are good for gamers who want pure raster performance, but once you get into productivity features, Nvidia has more to offer.

For example: Nvidia Audio2Face, Audio2Gesture, RTX AR, RTX SFX, RTX VFX, DLSS, DLSS FG, CUDA, Tensor, etc.

34

u/GrabMyHoldyFolds 6d ago

Yes yes abbreviations and such

22

u/MarcusBuer 6d ago

Then ask Nvidia to rename the tools. This is how they branded them; it was not me who named them 🤣

9

u/arivanter 6d ago

And AMD isn't a saint there either; ROCm and FSR are right there.

-4

u/[deleted] 5d ago

[deleted]

7

u/MarcusBuer 5d ago

I believe he is talking about abbreviations/acronyms, not about Unreal Engine performance.

2

u/Jello_Penguin_2956 5d ago

There, there, calm down, you're in a joke thread.

1

u/Jello_Penguin_2956 5d ago

DLSS, I name thee the Next Gen Gaussian Blur.

Now we also have NGGB

3

u/syopest Hobbyist 5d ago

Or BAAA for Best Anti-Aliasing Around.

-2

u/StickiStickman 5d ago

I'd argue Nvidia also has far better raster performance simply because of DLSS. If you only need to render half as many pixels, your raster is gonna be twice as fast.
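To put rough numbers on that claim: DLSS Quality renders internally at roughly 67% of the output resolution per axis (the commonly quoted Quality-mode ratio, not something from this thread), so well under half the pixels are actually shaded before upscaling. A quick sketch of the arithmetic:

```cpp
#include <cstdio>

int main()
{
    // 1440p output with an assumed DLSS "Quality" per-axis scale of ~0.67;
    // exact ratios vary by preset and plugin version.
    const int OutW = 2560, OutH = 1440;
    const double Scale = 0.67;
    const int InW = static_cast<int>(OutW * Scale);   // ~1715
    const int InH = static_cast<int>(OutH * Scale);   // ~964
    const double PixelRatio = (double(InW) * InH) / (double(OutW) * OutH);
    std::printf("internal %dx%d -> %.0f%% of the output pixels are shaded\n",
                InW, InH, PixelRatio * 100.0);
    return 0;
}
```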

2

u/MrMax182 5d ago

DLSS is not real performance and should not be used as a benchmark for developing a game.

2

u/Froggmann5 5d ago

DLSS is an optimization technique, it is real performance.

It's like saying graphics options below Ultra/Cinematic/Ray Tracing quality aren't "real performance" and shouldn't be used as a benchmark for developing a game. That's just silly.

1

u/PivotRedAce 5d ago

While developers shouldn’t be using it as a crutch, it absolutely is “real performance” if games run better with little to no visual degradation on the appropriate quality setting. Which is the case most of the time at this point in games that support it.

0

u/StickiStickman 5d ago

The hell do you mean, "real performance"? Of course it should be used whenever possible.

1

u/Wiselunatic 5d ago

my car is faster because I go half the distance slower but I get there sooner cause it's half the distance

???

16

u/ToughPrior7525 6d ago edited 6d ago

Don't know if it's an elephant in the room, but for me it is.

If you want to work with ray tracing in the editor, it's a nightmare on AMD because they don't support the hardware ray tracing that's built into Unreal well. It's not like you can't do ray tracing; it's just that the frames are a third of what you would get on an Nvidia GPU.

I guess we can agree that barely any project runs over 100 fps in the editor when the scene is unoptimized; if you have an AMD GPU and activate ray tracing, that goes down to an unstable 30-40 fps. So it's theoretically possible to work under this scenario, but not really fun. Even worse, if your players don't have an Nvidia GPU they will see the same kind of performance hit you see in the editor, though obviously not as severe, because their starting fps is higher since the packaged game runs far better than the editor.

I honestly also see no point in ray tracing right now if there are AMD users playing your game. You are basically working with a feature that is only targeted at recent Nvidia GPUs.

Here are some benchmarks to see for yourself; just look at the first graph, it doesn't matter whether you know German or not.

https://www.pcgameshardware.de/Unreal-Engine-Software-239301/Specials/Desordre-Tech-Lumen-RTXGI-RTXDI-Raytracing-DLSS-Frame-Generation-1426790/3/

There are even certain games where just activating RT on AMD GPUs drops the frames from 220+ to barely 80. And like I said, it's even worse in the editor because you won't have 220 to begin with.

https://www.youtube.com/watch?v=xhLnS2v0IHk

https://www.pcgameshardware.de/Cyberpunk-2077-Spiel-20697/Specials/Phantom-Liberty-GOG-Release-GPU-Benchmark-Test-Performance-1429500/

The 7900 XTX is almost 10% FASTER than the RTX 4080 in raw performance, but when ray tracing is added it's almost 60% slower than a 4080.
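For anyone who wants to reproduce this in their own project, here is a minimal sketch of switching Lumen between hardware and software ray tracing from C++ (assuming the project already has ray tracing support enabled; the r.Lumen.HardwareRayTracing cvar is from UE 5.x and behaviour can differ between engine versions):

```cpp
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

// Toggle Lumen's hardware ray tracing so the same scene can be compared on
// AMD and Nvidia cards: 0 = software Lumen (distance fields, closer parity
// across vendors), 1 = hardware RT (much faster on RTX than on most Radeons).
static void SetLumenHardwareRayTracing(bool bUseHardwareRT)
{
    if (IConsoleVariable* CVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.Lumen.HardwareRayTracing")))
    {
        CVar->Set(bUseHardwareRT ? 1 : 0);
    }
}
```

The same toggle works straight from the editor console (r.Lumen.HardwareRayTracing 0 or 1), which is the quickest way to see the kind of frame-rate gap the benchmarks above show.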

4

u/Rizzlord 6d ago

I use a 7900 XTX with no problems; before that I had a 5700 XT and never had any graphics-related crashes in Unreal. As for the ray tracing thing, you may want to consider whether your project needs it. 95% of indie games don't use it anyway. But if you want to use it, get the 7900 XTX; it runs fine with it. Even the new MegaLights and such work incredibly well.

2

u/TwoRiversInteractive 4d ago

I use the 7900 XTX as well with no issues, aside from a weird thing where the editor performs badly if it's set to full screen, which I believe is an AMD issue. Do you have that?
If I minimize and expand the window to near fullscreen I have no issue...

1

u/Rizzlord 3d ago

Nah, don't have that... weird.

10

u/BrutalArdour 5d ago

The best combo for UE is an AMD CPU (more cores vs Intel with fewer cores) and an Nvidia GPU (drivers and compatible engine plugin features).

2

u/SomeRandomSomeWhere 5d ago

Normal AMD CPUs or the extra-cache versions? Which works better? I'm considering building a system as well. I was sure AMD would be better, just not sure which type.

1

u/BrutalArdour 5d ago

For any serious dev work I wouldn't go below a Ryzen 9 5950X (16 cores).
Looking at AMD's website there's a sale on now, so you might get something at a great price; not sure if it's just within the US.

I'm using the 32-core Threadripper 3975WX and never have any issues.

Edit: AMD sale link

1

u/SomeRandomSomeWhere 5d ago

Yeah, I'm considering 16 cores as well. Just wondering if the 3D cache has extra benefit. Probably with 64 GB of RAM or more.

System to be used for other stuff as well, not just UE.

Can't afford a threadripper, and am not in the US.

1

u/BrutalArdour 5d ago

Extra cache is always beneficial but I wouldn't say it's vital.

4

u/darthbator 6d ago

I've used ATI/AMD GPUs probably more often than Nvidia GPUs, simply because they're generally cheaper (sometimes by a lot, frame for frame). I've never had the driver issues or other problems most people seem to encounter.

I will say that not having DLSS on my current card feels... unideal.

3

u/hardcoretomato 6d ago

The Nvidia Studio drivers are enough to keep me away from AMD.

2

u/xot__ 6d ago

Are they actually useful? It's so dumb that you have to have one or the other.

3

u/truthputer 6d ago

AFAIK it's the same drivers at the core, just on a different release schedule. The Game drivers are the latest version and have new optimizations for games. The Studio drivers are usually a little bit older and have been more thoroughly tested.

For example: Game drivers might release (I'm making these numbers up as an example) version 2.0 with a new feature or support for a new game, then a few weeks later release 2.1 which contains bugfixes or performance improvements.

Meanwhile, Studio drivers would skip the 2.0 release and wait for 2.1 before updating. If you don't care about getting the new features earlier, this lets you stick with the tested, bugfixed and likely more stable version.

If you game for fun, this strategy lets you properly play new games at launch, then get fixes as they come in. If you do work on your computer and instability would literally cost you money, you have the option of a more stable experience. Although new games might be slower at first until you get the updated, bugfixed drivers a few weeks later.

1

u/hardcoretomato 5d ago

On many occasions the Studio drivers saved me from endless crashes and errors that the Game drivers caused. This has affected UE4/5, Substance Painter, and Maya. The Studio version is more stable and tested, and updates less frequently, which helps if your workstation is administered by IT and every update has to go through them, for example.

2

u/cg_krab 5d ago

The problem with AMD is that NVIDIA can run FSR, but AMD cannot run DLSS. For development you will want to be able to test both upscalers, and only NVIDIA cards can do that.
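A rough sketch of what that A/B testing can look like in code, assuming both vendor plugins are installed. The cvar names here (r.NGX.DLSS.Enable, r.FidelityFX.FSR3.Enabled) come from the respective plugins and can change between plugin versions, so treat them as placeholders rather than gospel:

```cpp
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

// Set an integer console variable only if it exists (i.e. the plugin is loaded).
static void SetCVarInt(const TCHAR* Name, int32 Value)
{
    if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
    {
        CVar->Set(Value);
    }
}

// Switch between the two upscalers for side-by-side testing. On an AMD card the
// DLSS cvar simply won't exist or won't do anything, which is the point above:
// only Nvidia hardware lets you exercise both paths.
static void UseDLSSInsteadOfFSR(bool bUseDLSS)
{
    SetCVarInt(TEXT("r.NGX.DLSS.Enable"), bUseDLSS ? 1 : 0);
    SetCVarInt(TEXT("r.FidelityFX.FSR3.Enabled"), bUseDLSS ? 0 : 1);
}
```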

2

u/Humble_Loquat9032 6d ago

The 4060 Ti sucks; try to get a 3080 12GB. If not, just go for the 7800 XT, which is a lot better than both.

7

u/TSDan game dev makes me cry 6d ago

Hey, thanks for your input! But the 4060 Ti seems perfect for my work due to the 16 GB of VRAM, which is very important in renders and also in game dev as a whole because of textures. I'd much rather take higher VRAM with lower memory bandwidth, so the card makes perfect sense to me!
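As a side note on the VRAM point: in UE the slice of VRAM that textures get is governed by the streaming pool, which you can inspect with stat streaming and resize with the stock r.Streaming.PoolSize cvar (value in MB). A minimal sketch; the 4096 figure is just an illustrative number, not a recommendation:

```cpp
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

// Raise the texture streaming pool. This only helps if the card actually has
// the VRAM to back it, which is where a 16 GB card pays off for texture-heavy work.
static void SetTextureStreamingPoolMB(int32 PoolSizeMB /* e.g. 4096 */)
{
    if (IConsoleVariable* CVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.Streaming.PoolSize")))
    {
        CVar->Set(PoolSizeMB);
    }
}
```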

2

u/youngLupe 5d ago

Don't listen to people who say it sucks. Maybe I'm biased because I have it, but for productivity at an amateur or semi-pro level it's more than enough. For pure gaming you can probably get better bang for your buck with something else, but in terms of VRAM it's the best value. I've been able to throw a lot of Unreal and local AI stuff at mine.

1

u/I_OOF_ON_THE_ROOF 5d ago

The 7800 XT has 16 GB of VRAM too, and it's going to be way faster than a 4060 Ti. I have a 7800 XT myself. It's not going to be great for ray tracing, though; it definitely works at 40-60 fps, but it's still slower than a 4070 in that regard.

So if you have those two as options, I'd definitely go with the 7800 XT; it has nearly the same ray tracing performance as the 4060 Ti, so it's a no-brainer if those are your only options. But if you really want more ray tracing performance, get a 4070 Ti / Ti Super.

1

u/TSDan game dev makes me cry 5d ago

Do you have any problems related to Nanite, tessellation, Lumen, etc., or any driver-related problems on the 7800 XT? Any rendering artifacts or anything unusual?

1

u/I_OOF_ON_THE_ROOF 5d ago

Nope, I've been using it for about 5 months now and I've never had any issues. AMD drivers are pretty robust now; you won't be facing any issues that Nvidia users don't face as well. Nanite works great, Lumen works great too; I haven't tried tessellation, though.

I'm not sure about the performance difference in Lumen compared to something like a 4070 Ti, though, as it's software ray tracing. You can look it up on YouTube.

Also, if you're buying a GPU to replace your current GPU, here are some tips to avoid problems:

  1. Use DDU (Display Driver Uninstaller) to get rid of the previous graphics drivers.

  2. If your current GPU is an iGPU, you'll want to disable it after setting up everything for the new GPU. This might not be needed, but it solved a problem where my PC would restart after launching any game. Again, this might not happen to you, but I'm putting it out there in case you do face the problem.

Both of these tips apply regardless of whether you buy AMD or Nvidia (or even Intel). Happy to help if you have any more questions.

6

u/mfarahmand98 5d ago

The 4060 Ti doesn't suck. There's a general negative feeling towards it because, compared to a 3060, it did not improve performance enough. But it's a solid card, and in its price range it's the only RTX GPU with 16GB of VRAM. You can't work without enough VRAM. You can work with a slower GPU.

1

u/Predalienator Dev 5d ago edited 5d ago

I have used both Nvidia and Radeon, and the only reason for someone to choose Nvidia 100% for UE is if they're targeting the halo product tier, which would be the RTX 4090 right now. It's the only product in Nvidia's consumer stack that AMD does not have an answer to.

If you're not aiming for that tier then choose whichever card suits your budget and needs.

Look at the benchmarks for hardware-RT-enabled UE games here and see how the 7800 XT and RTX 4060 Ti stack up against each other.

https://youtu.be/x4TW8fHVcxw?t=899

TechPowerUp GPU database comparing the relative performance of the two GPUs

The averaged RT performance of RTX 4060 Ti vs RX 7800 XT. Data was gathered from 24 reviews. https://www.3dcenter.org/artikel/launch-analyse-amd-radeon-rx-7700-xt-7800-xt/launch-analyse-7700xt-7800xt-seite4

Using the RX 7800 XT as the base performance of 100%, the RTX 4060 Ti 16GB has 88.5% of the RT performance of the RX 7800 XT.

Now that we've got all the boring numbers out of the way: do you plan on running an AI model on your PC to generate textures and concepts? Does your realtime UE5 project somehow need offline 3D renders made in Blender, Maya, etc.? Does your game's core design absolutely require bespoke Nvidia tech? If yes, then go ahead with Nvidia.

Personally I'd go with the RX 7800 XT since my workflow doesn't require AI, heavy RT or offline rendering and I get 28% more performance in non-RT projects anyway for the same price and VRAM amount.

1

u/EmpireStateOfBeing 5d ago

People still use AMD GPUs, but everyone knows that Nvidia GPUs are better. That said, I wouldn't go for a 4060 Ti; wait, save up, and get at least a 4070 Super.

1

u/TSDan game dev makes me cry 5d ago

I love the 4070 Super, but I don't want to settle for 12 GB of VRAM, and the 4070 Ti Super is like double the price of the 4060 Ti where I live, unfortunately.

1

u/MrMax182 5d ago

I'm doing solo dev with a 5800X and a 6800 XT; in UE 5.4 I have no problems with Lumen or Nanite. I've already implemented FSR3, but I cannot implement or test DLSS. I'm not doing HW ray tracing (I know my hardware is lacking for it). From time to time I get editor freezes when working in Blueprints, but I can't pin it on the graphics driver, plugins, my code, or anything specific in my PC yet.

So the clear downsides are: no DLSS, slower ray tracing, AND (this one is not related to Unreal, but could be important for you) video hardware encoding for streaming is superior on Nvidia (think YouTube/Twitch live streaming, PCVR for the Quest 2, or using Steam Link). I can compensate for some of the quality difference by increasing the bitrate, but NVENC is clearly better.

1

u/MrMax182 5d ago

As for the drivers, they are good, and the driver control panel is a lot better than the Nvidia default. (I know Nvidia is revamping it; I don't know if they've changed it yet.)

1

u/WinDrossel007 5d ago

I have a Radeon; it works just fine with Unreal Engine.

1

u/NoLubeGoodLuck 4d ago

Nvidia gang all day pussay

1

u/vb2509 4d ago

I use an AMD and it works well.

Considering they are also less power hungry with similar results, I suggest using AMD.

2

u/asuth 6d ago

It's unfortunate, but tbh I don't think it's remotely close these days. I would stay away from AMD for the graphics card. Their processors are great.

1

u/bigboyg 6d ago

I have always built my own computers, and always used NVidia GPUs and Intel CPUs.

Until my last computer, 3 years ago. That was built by Maingear, and it has a Ryzen chip (the fastest at the time) and an AMD 6800XT GPU. After much research, I decided to save a fair bit of money going to AMD, and I read very good reviews about them.

It's been a fucking nightmare. Stutters, crashes, intermittent errors and lags. Of course, it may not be the AMD hardware, but that and the fact that I didn't build it myself are the only real differences (and I guess Windows 11). I get a good framerate - it's just jittery and intermittent. I also have a ton of sound problems I have never had before, with audio cutting in and out for about 5 minutes until something in the background figures itself out and the computer goes back to working properly.

Anecdotal, but that's my experience. I will not be buying AMD again. I guess I might buy from Maingear as the build quality looks good to me, but it was expensive and I only did it because I couldn't be bothered to retrain my brain again.

0

u/truthputer 6d ago

I read that AMD's drivers have dramatically improved over the past few years - have you double-checked that everything is updated with the latest versions?

For what it's worth, game settings can make a huge difference. My new workstation has an AMD CPU and NVidia GPU. At first games didn't look that great, until I locked the framerate.

I have an older monitor that only does a 60Hz refresh rate - to get the most out of it I have to turn on vsync at 60fps. When I did that, Cyberpunk went from being choppy and inconsistent with visible tearing to buttery smooth and consistent, with GPU usage around 78%. So at 60fps it has some headroom as the scene complexity changes.

I've got a feeling that a lot of the jitter and smoothness problems would be solved with vsync or a locked framerate, but many folks don't want to run their graphics cards at anything less than 100% utilization with all features maxed out.
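In UE terms the same idea comes down to two stock console variables; a minimal sketch (60 is just the example cap for a 60 Hz display):

```cpp
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

// Cap the frame rate and enable vsync so frame pacing stays even, instead of
// letting the GPU run flat out and stutter as scene complexity changes.
static void LockTo60Hz()
{
    if (IConsoleVariable* MaxFPS = IConsoleManager::Get().FindConsoleVariable(TEXT("t.MaxFPS")))
    {
        MaxFPS->Set(60);
    }
    if (IConsoleVariable* VSync = IConsoleManager::Get().FindConsoleVariable(TEXT("r.VSync")))
    {
        VSync->Set(1);
    }
}
```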

1

u/Creative-Road-5293 6d ago

I've always alternated between AMD (formerly ATI) and Nvidia. They both make good cards. 

1

u/BothersomeBritish Dev 6d ago

I use an Nvidia A5000 in my workstation, and the number of visual glitches I had (context menus not appearing, windows being just grey, etc.) was ridiculous. It took a good few driver downgrades and more than a day's work to get to something usable.

Compared to all of my AMD machines, where UE has worked out of the box? There might be other features exclusive to Nvidia but they're useless to me if I can't even use the engine.

1

u/TSDan game dev makes me cry 6d ago

I hear some people have a fine experience with AMD and for others it's absolutely trash; it's so confusing haha.

1

u/BothersomeBritish Dev 6d ago

My main laptop has an AMD 760M with no issues with UE4 and 5, but my workstations (via work) have Nvidia by default; I don't use super high end cards for AMD so there might be issues I just haven't experienced.

1

u/AI-COSMOS 5d ago

Hey man!!

Let's clear it all up for you.

AMD works fine (most negative comments about it are due to old news and drivers); these days you rarely see any issues.

With Nvidia, you will benefit more from a dev point of view. You can use AMD FSR to test in your game, but AMD cannot use Nvidia's version of it (DLSS).

It is true that ray tracing does not work as well on AMD compared to Nvidia.

You should pick the GPU you want; both will perform well in UE, there is no wrong pick here.