Recently, I have had to perform some seriously intense 3D modelling in AutoCAD. The drawing consisted of 300k+ 3D faces, amounting to approximately 1 million vertices.
At the time I was running a GeForce GTX 960 2 GB card. The performance made the drawing impossible to work with (not to mention the incredible amount of crashing I experienced), and I had to resort to splitting up the drawing. I also found that importing the large file into 3DS Max and modelling in that environment works really well, then exporting the model back to AutoCAD as a whole mesh.
I then asked the question: "If I had the recommended and 'certified' video card that Autodesk suggests, would I get better performance and fewer crashes?"
So I bought the Nvidia Quadro K2200 (double the price of a GTX 960), which is recommended for heavy AutoCAD modelling. I refused to spend any more on a video card, as it would not be cost-effective anyway. This card has more memory, but everything else on paper is worse than the GTX 960, and unfortunately there are no good benchmarks or data to go by for a comparison. I was hoping the optimised drivers and the ECC VRAM would help out.
This is my conclusion:
Performance-wise, AutoCAD was actually slower with the K2200 than with the GTX 960. However, there was a significant increase in stability, resulting in far fewer crashes (actually none in my case). There was also a significant improvement in 3DS Max performance. The absence of crashes alone saved some time, but the reduced performance didn't help.
So to all AutoCAD users out there: DO NOT WASTE YOUR MONEY ON QUADRO GPUs (even if Autodesk and Nvidia say you should). If AutoCAD is performing poorly on your GTX card, it is because AutoCAD is poorly optimised and has limitations as a software package; it has nothing to do with your hardware. These limitations have been explored and solved in other Autodesk software like 3DS Max and Navisworks. So to all modellers out there: SPEND YOUR MONEY ON A QUADRO ONLY IF YOU WILL USE NAVISWORKS AND/OR MAYA AND/OR 3DS MAX (and only if you are going to use these packages exclusively). Even then, note that a high-end GTX card will be more than adequate for all three packages.
More detail isn't necessary, as this is targeted at management in the hope that they don't fall for the "Workstation" or "Professional" marketing strategies that both Autodesk and Nvidia play with. The cards are not worth it if you are doing engineering modelling.
I'm glad to see you are running a GTX 960. What version of AutoCAD are you running? I recently purchased a GTX 960, and on startup of AutoCAD 2016 or 2017 the program stops working, meaning it won't open. Do you have any insight?
It could be, as with many other software packages, that there is either a driver conflict (i.e. the registry still thinks the old graphics card is in use) or an incompatibility. In this case it is most likely the former. Normally, running a repair, installing the latest drivers, performing a system clean with CCleaner, or reinstalling the software will fix such things. Otherwise, hopefully someone else can help. Good luck.
I am so psyched to see that someone has been looking at this. I do lots of 3D modeling and rendering with AutoCAD (ACA 2015).
I have been researching a new workstation for about two years now and was very curious to see the results.
I kept seeing these new K2200 and K4200 cards come along and was always thinking, "I could do so much better if I just had that card and the ECC RAM/motherboard/Xeon."
It's nice to know that a GTX will get me going in the right direction, and I saw that the drivers are certified/available, so very cool.
I wonder how render times increase or decrease with that card, though. Did you see it speed things up once rendering had commenced?
Recently I also saw a YouTuber's video on Xeon vs. i7, and it made sense, but I'm still not sure which direction to go.
The guy said that if you don't intend to use the internal graphics, then find the comparable Xeon and go with it. Granted, ECC RAM is more expensive than non-ECC, and I don't know whether AutoCAD likes it; ECC might cause it to stall/hang when it encounters a memory error of some sort.
Your thoughts?
You have to remember that AutoCAD's core was originally coded years ago, and since then it has received only minor, backwards-compatible tweaks here and there alongside the newer software Autodesk wrote. This means that AutoCAD is very poorly optimized for multi-core processing and does not utilize hyper-threading whatsoever. Newer software like Maya, Navisworks etc. does have this optimization. As for ECC memory, I highly doubt it will make your experience any better. ECC memory is neither faster nor cheaper; it is less volatile, meaning it is subject to less data loss. That makes ECC very attractive for servers, where you can't afford to lose data (a medical aid server, for example, cannot just "lose" someone's file due to a memory error). For a workstation, in my opinion, one of the new floating-point-heavy GPUs (Nvidia GTX 1080 or 1070) with a powerful i7 and reasonably fast memory (say 2400 MHz, which is now standard DDR4) is actually over the top for most people. If you are in high-end rendering such as game development then, yes, a Quadro is the way to go, but then you can't hold back with the K2200; you'll have to purchase a much more powerful card. The K2200 does not have double-precision floating point and has roughly the same capabilities as the GTX 760.
With regards to speed: we saw zero increase in AutoCAD performance with the Quadro series of GPUs (expected, as AutoCAD now runs on DirectX, which is actually better suited to GTX cards). There was also no change in the number of crashes we were experiencing. For high-end rendering the Quadros do perform well, but with the latest Pascal architecture and its mixed-precision floating point, the GTX 1080 and 1070 will probably outperform any Quadro at everything until the Quadro line adopts that architecture. While your CPU and RAM are used intensively during rendering, I believe a Xeon CPU and ECC RAM (which, to match an equivalent i7, will be very expensive) are redundant: the performance increase will be negligible, and data loss during the rendering process is not a real concern. My final verdict for a good workstation: Quadro redundant; Xeon redundant; ECC RAM redundant. An i7 Skylake 6700K with 16 GB of 2400 MHz DDR4 RAM and an Nvidia GTX 1080 or 1070, depending on your budget, is the way to go (this will be an extremely good workstation, and you're getting value for your money). Remember, consumer-grade hardware (i7s and GTX cards) undergoes less quality control than professional hardware (Quadros and Xeons), but that isn't a problem, because hardware goes out of date very quickly these days (and most consumer-grade hardware carries a manufacturer's warranty of 1-2 years), so buying expensive professional hardware is, again, redundant.
Thanks man! That was a lot of info. I have to read up on the new architecture; thanks for the tip. I know I can find a system like that for around 2200 or so.
Bravo! Below is what I typically render for clients; it took about 5 hours, and I would love to turn that into 1.
Right now I use an HP Pavilion:
Win 7 / 64-bit
Processor: AMD FX-6120 hexacore (3 physical / 3 virtual)
Graphics card: HD 7450
32 GB of RAM
I've heard about the i7 Extreme series chips, but they are expensive too.
Modeling can lag, and working on a 9 MB file is pretty slow when switching between tabs etc.
I'll definitely be checking out those GTX cards.
Hi mate,
I'm looking to get SolidWorks and a new laptop. I will be 3D scanning vehicles to design and build bull bars and accessories.
Would a GTX 1070 8 GB graphics card be a better option than the Quadro M3000 card?
Any help would be much appreciated.
Thanks
Of those two cards, the 1070 is the one for you. The M3000 just doesn't have the amount of memory required for such work and is a far slower card than the 1070. The mixed-precision floating point on the new Pascal architecture makes Quadros redundant, except perhaps the high-end M6000, which has a ridiculous amount of on-board memory, as is required for rendering very complex movie scenes and the like.
I was on the fence on this issue as well, and decided to try a GTX 1060.
For those who need convincing: I just ran the CADALYST benchmark on my system with a GTX 1060 3 GB card, an overclocked 6700K CPU, 16 GB of 3000 MHz RAM, and a Samsung 950 Pro SSD. My results were astonishing compared to some others I found online that were using a MUCH more expensive E5-2697 v3 and Quadro K5200.
The benchmark gives users something to compare against.
My benchmark score (i7 6700K, Asus GTX 1060 3 GB):
C2015 Total Index = 745
3D Graphics Index = 1806
2D Graphics Index = 514
Disk Index = 319
CPU Index = 342
Single loop time: 8 min

MUCH higher-end system (Xeon E5-2697 v3, Quadro K5200):
C2015 Total Index = 512
3D Graphics Index = 1161
2D Graphics Index = 383
Disk Index = 263
CPU Index = 242
Single loop time: 10 min
We are running AutoCAD 2016 with CADWorx 2016. I just rebuilt an HP Z420 and put in 32 GB of RAM, a 500 GB SSD, and a GeForce GTX 1070. It was working great, and then all of a sudden we started getting a memory error that points to the video card: Unhandled Access Violation Reading 0x24e0. That is actually how I ran across this forum. We have run GRAPHICSCONFIG and 3DCONFIG with no change. Does anyone have any suggestions? I am perplexed. I am an IT manager and not an AutoCAD user. I wonder if there is any training for us hardware support people?
I believe that when you first install AutoCAD, it registers the GPU and other info from your system, but it is now detecting a major change, so the startup profile doesn't match the hardware; it might even think it's a different computer. I think it's a licensing issue. Maybe a reinstall will work. I don't remember if AutoCAD has a repair option on the original CD.