GPU/CPU setup advice

vblevin
Enthusiast
I'm trying to decide between 2 different cpu/gpu combos.

My models are under 200mb, and I'd like to start using rendering programs like D5

 

1. Ryzen 9900x and GeForce RTX 4070 Super

2. Ryzen 9950x and Nvidia RTX A2000

 

Any advice?

0 Likes
1,271 Views
Replies (32)

RDAOU
Mentor

@vblevin wrote:

 

I wish I could find someone with my similar workflow who can say "I have an RTX ____ and it performs like this ____"

 


If it helps, I have two desktops and two laptops at home (all personal and all DIY builds), and two of them are close to what you mentioned:

  • One of the desktops has a GeForce RTX 3080 Ti and a Ryzen 9 5950X on an MSI X570S Ace Max board with 16x2 GB Corsair Vengeance RAM, liquid cooled (runs flawlessly) <--- the configuration of this one would be very close to what you described in performance
  • A laptop (the mobile version of the above) with an Intel Core i9 and an RTX A2000 mobile, which I upgraded to an RTX A4000 mobile due to performance issues... on the A2000 the GPU fan went nuts and ran at high speed almost all the time... the A4000 has a better base/boost clock, and the extra 8 GB of VRAM contributes a lot to the difference

 



0 Likes

vblevin
Enthusiast

@RDAOU wrote:

 

  • One is a desktop with an RTX 3080 Ti and Ryzen 9 5950X on an MSI X570S Ace Max board with 16x2 GB Corsair Vengeance RAM, liquid cooled (runs flawlessly) <--- it's a two-year-old desktop but it would be very close to what you described in performance

Do you render with this desktop? Does it take long?

0 Likes

RDAOU
Mentor

@vblevin 

 

I sometimes do when I'm too lazy to work upstairs... Over the past four months, I've mostly been using my new rig, and the old desktop has more or less become a Cyberpunk and COD station, lol.

 

Not sure how one can benchmark the time for you to compare... Revit rendering doesn't require much GPU power; it's more CPU. For instance, the Revit Architecture sample project (the two-story house):

  • Default settings on Best / screen resolution, 10 MB (region capturing only the building and partial topo): approximately 4 minutes in Revit 2025.
  • Using the default material assets assigned when transferred to Twinmotion via the plugin: approximately 2 minutes.

But for GPU specifically, for the A2000, I did a test walkthrough and posted somewhere on this forum a few years back. The OP back then was also trying to check GPU performance before updating. I'll try to look it up and repost it in an edit to this post (just to convince you to stay away from the A2000).

 

 

 



0 Likes

vblevin
Enthusiast

Are the 4 min and 2 min with the RTX 3080?

What's your new rig?

0 Likes

RDAOU
Mentor

@vblevin 

 

Revit photo rendering is CPU, not GPU... Revit's built-in rendering engine, based on Autodesk Raytracer (and previously the Mental Ray engine), uses the CPU for rendering. Revit relies on the GPU for basic model visualization, like navigating a 3D view or working in shaded or realistic mode, which requires rendering textures, shadows, and basic lighting in real time to display the model while you work. Arnold, V-Ray, and Mental Ray also use CPU rendering, but unlike them, the issue with Revit is that it does not use the full potential of the CPU.
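
If you want to see this for yourself, you can log CPU and GPU utilization side by side while a render runs. Below is a minimal sketch, not something Revit provides: it assumes Python 3, the third-party psutil package, and an NVIDIA card with nvidia-smi on the PATH.

# Rough utilization logger -- start it, kick off a Revit render, and watch
# which component is actually doing the work. Assumes Python 3, psutil
# (pip install psutil), and nvidia-smi available on the PATH.
import subprocess
import time

import psutil


def gpu_utilization() -> float:
    """Return GPU utilization in percent as reported by nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip().splitlines()[0])


if __name__ == "__main__":
    print("time      cpu%   gpu%")
    for _ in range(60):                        # sample once a second for a minute
        cpu = psutil.cpu_percent(interval=1)   # averaged over the 1 s interval
        gpu = gpu_utilization()
        print(f"{time.strftime('%H:%M:%S')}  {cpu:5.1f}  {gpu:5.1f}")

During a Revit render the cpu% column should sit high while gpu% stays low; in a GPU/hybrid renderer like Twinmotion or D5 the GPU column should take over.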

 

To answer your question: yes, on that PC it takes 2 minutes to render an external camera view of the house and site in Twinmotion (same in Blender) and about 4 minutes in Revit. Revit doesn't use the CPU's full capacity, so you would need to overclock... compared to Twinmotion, Lumion, and Blender, where rendering goes much smoother and faster using hybrid CPU/GPU rendering. Rendering in Revit is like having a Ferrari but running it on diesel.

 

My current rig is a Ryzen Threadripper 7970X, liquid cooled, plus a SAPPHIRE AMD Radeon RX 7900 XTX, on an MSI motherboard of course.

 

[screenshots of the new rig attached]

 

 

AMD, in my opinion, is better value for money.

 

 

 

 

 



0 Likes

vblevin
Enthusiast

I use Revit LT, so I would be using  a program like D5 Render.

Is a B650 good enough, or is it worth it to get a B670 motherboard?

Thanks for all your input, BTW

0 Likes

HVAC-Novice
Advisor

B670 is Intel, B650 is AMD. Did you by any chance mean the X670/E, which is AMD? 

 

The higher-level chipsets allow more PCIe lanes. But really, the only lanes you need are to the GPU and the SSD, and both have a direct connection to the CPU; they don't go through the chipset. Anything going through the chipset will be slower; that is where USB and secondary SSDs go.

 

If you add a lot of extra cards, you have to look at what limitations the MB has. Sometimes slots share PCIe lanes. But if it is only one SSD and one GPU, it should not matter.

 

But look at what the specific MB offers. Just because boards use a chipset doesn't mean they use all of its features. For higher-power CPUs (especially if you enable PBO), the VRM (voltage regulator) design also matters. For specific MB questions, a computer forum may be better. Many people here get a PC from the IT department and don't have much choice, but on a computer forum you have all the people who build their own PCs.

 

I don't know if Revit LT even offers all the rendering options. I think testing rendering on your current hardware and then comparing how that hardware stacks up against the proposed hardware would be a starting point (a rough way to do that is sketched at the end of this post). Revit performance is very personal and hard to benchmark. My project could be a shed with no electricity, HVAC, or any colors, or it could be a World Trade Center with all types of renderings and MEP systems... it is the same software, but the hardware requirements will be different.

 

Gaming performance is easy to compare. You take a few game titles, you determine the resolution and shader/RT settings, and for everyone playing the game with the same hardware it will be the same experience. But in Revit, everyone is playing a totally different game at totally different levels of detail.
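
That said, if you just want a crude apples-to-apples number for raw all-core CPU throughput between your current box and a proposed one, something like the sketch below works. It is only a rough proxy under the assumption that your renderer scales across cores; it is not a Revit benchmark and won't capture project-specific behavior.

# Crude all-core CPU timing -- a rough proxy for comparing two machines,
# not a Revit benchmark. Pure standard library, Python 3.
import math
import time
from multiprocessing import Pool, cpu_count


def burn(n: int) -> float:
    """A fixed chunk of floating-point work."""
    return sum(math.sqrt(i) * math.sin(i) for i in range(n))


if __name__ == "__main__":
    chunks = [2_000_000] * 32                # same total work on every machine
    start = time.perf_counter()
    with Pool(processes=cpu_count()) as pool:
        pool.map(burn, chunks)               # spread the chunks across all cores
    elapsed = time.perf_counter() - start
    print(f"{cpu_count()} logical cores, all-core time: {elapsed:.1f} s")

Run it on the old machine and the proposed one and compare the times; lower is better.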

Revit version: R2025.4
0 Likes

vblevin
Enthusiast

Revit LT does not have rendering, that's why I'll be using a rendering program like D5 Render.

They have user benchmarks for various GPUs. For example, an RTX 4070 Super render time is 30-40 sec., and an RTX 3060 is 80 sec.

Although I don't know exactly what is being rendered, it does tell me something.
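
For what it's worth, taking those published numbers at face value (the scene and settings behind them are unknown), the rough math works out like this:

# Back-of-envelope speedup from the quoted D5 user benchmarks.
# The underlying scene/settings are unknown, so treat this as a rough ratio only.
rtx_4070_super = (30 + 40) / 2   # midpoint of the quoted 30-40 s range
rtx_3060 = 80                    # quoted render time in seconds
print(f"~{rtx_3060 / rtx_4070_super:.1f}x faster")   # -> ~2.3x faster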

I'll probably just get the Ryzen 9900X and the RTX 4070 Super and be done with it (and leave you guys alone 😉)

I spent way too much time researching this: time I could have spent earning $ to pay for the computer...

 

Here's the final setup if you care to offer any final thoughts:

[screenshot of the proposed build attached]

 

 

0 Likes

HVAC-Novice
Advisor

I'd use a single 2 TB SSD instead of two 1 TB ones. That saves M.2 slots, is cheaper, and gives more flexibility. You can partition the SSD to separate Windows and data, and if needed, you can change partition sizes. If you buy two separate 1 TB SSDs, you are basically locked into a 1TB-1TB split. And assuming your MB only has one M.2 slot directly connected to the CPU's PCIe lanes, you don't have to decide which SSD needs to be faster.

 

I don't know the Crucial T700. I doubt you need PCIe 5 (and make sure the MB offers PCIe 5 if this is important to you). 

Revit version: R2025.4

Christian_Santoso
Contributor

If you prioritize raw performance, then you can pick 1.
If you prioritize overall stability (no crashes), you can pick 2.

Both of them are good in their own way: the RTX A2000 is the workstation-class graphics card, and the RTX 4070 Super has more cores etc., but again it's intended for gaming, so... yeah.

Jr. Solutions Engineer
Autodesk AEC Building | M&E
0 Likes

ToanDN
Consultant

@Christian_Santoso wrote:

If you prioritize raw performance, then you can pick 1.
If you prioritize overall stability (no crashes), you can pick 2.

Both of them are good in their own way: the RTX A2000 is the workstation-class graphics card, and the RTX 4070 Super has more cores etc., but again it's intended for gaming, so... yeah.


There is no evidence that the GeForce RTX series is more prone to crashes than the RTX A2000. In fact, 3D games demand much more graphics processing power than Revit. Being 3-year-old tech with a slower clock speed, less VRAM, and less memory bandwidth, the A2000 will crash more when handling heavy Revit project models.

HVAC-Novice
Advisor

Crashes are caused by low-quality hardware, incompatible or extremely overclocked hardware, lack of cooling, and/or installation of a lot of "questionable" software.

 

There is no proof that a consumer GPU is more prone to crashing than a "workstation" card. On the contrary, gamers run them at high load for hours on end. In Revit, you only need the burst for a short time (unless you are rendering).

 

Gamers have absolutely zero tolerance for crashes or imperfect performance. 

 

During the crypto boom, all those miners bought up the consumer-grade GPUs and happily ran them 24/7 at full load.

Revit version: R2025.4
0 Likes

RDAOU
Mentor

It depends on the type of application each is being used for... There is proof, as well as online reports, for the following (from Nvidia as well as independent test labs):

  • For real-time rendering, the GeForce 3080 Ti and 3090 Ti can easily outperform the Quadro RTX A4000 and RTX A5000.
  • For scientific research simulation, AI model training with large databases, and 8K video editing and compositing, the Quadro RTX series outperforms the GeForce.

 

Crypto miners are not a benchmark... There are two main reasons why they go for gaming GPUs: the first is the refresh rate, which is a game changer when it comes to mining, and the second is their cost-effectiveness compared to the Quadro RTX. The downside they face is overheating and higher power consumption. You can have a data center right next door and sleep peacefully, but a 12-foot mining hub can keep the neighborhood up all night.

 

 

 



0 Likes