Discussion Groups

Installation - Hardware - OS

Valued Contributor
Kojima_Pro
Posts: 55
Registered: ‎03-26-2008

NVIDIA question.

22 Views, 18 Replies
04-03-2008 02:39 AM
I have an NVIDIA graphics card, but when I try to install "NVIDIA MAXtreme™" I get this error: "NVIDIA %P setup could not detect an NVIDIA workstation graphics card to use with NVIDIA %P(tm). Please install an NVIDIA graphics card."

Why does it say I have no NVIDIA graphics card... WHEN I DO! I even have an NVIDIA icon at the bottom right of my desktop, right next to the clock.

Can somebody please explain to me why this error occurs?
Distinguished Contributor
Maneswar_Cheemalapati
Posts: 1,081
Registered: ‎10-28-2002

Re: NVIDIA question.

04-03-2008 02:55 AM in reply to: Kojima_Pro
Maxtreme only works on NVIDIA Quadro workstation graphics cards, not on NVIDIA consumer gaming cards.
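For what it's worth, the installer's complaint is almost certainly a model-name check rather than a truly missing card: it sees a GeForce, not a Quadro, and refuses. A minimal sketch of that kind of gate (the function and card names here are illustrative, not the actual installer code):

```python
# Illustrative sketch of the kind of check the MAXtreme installer likely
# applies: accept only NVIDIA workstation (Quadro) parts by name.
def is_supported_workstation_card(gpu_name: str) -> bool:
    """Return True only for NVIDIA Quadro workstation cards (assumed rule)."""
    name = gpu_name.lower()
    return "nvidia" in name and "quadro" in name

print(is_supported_workstation_card("NVIDIA Quadro FX 3450"))   # True
print(is_supported_workstation_card("NVIDIA GeForce 6800 GT"))  # False
```

So a consumer GeForce fails the check even though it is a perfectly real NVIDIA card.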
Valued Contributor
Kojima_Pro
Posts: 55
Registered: ‎03-26-2008

Re: NVIDIA question.

04-03-2008 02:59 AM in reply to: Kojima_Pro
So is there a driver that's better performance-wise that supports something lower than a Quadro? I installed my NVIDIA card myself...
Moderator
Steve_Curley
Posts: 19,009
Registered: ‎08-07-2007

Re: NVIDIA question.

04-03-2008 10:07 AM in reply to: Kojima_Pro
Just install the latest drivers from nVidia.

Max 4.2 through 2015 (SP2+EXT1), Composite 2014.
Win7Pro x64 (SP1). i5-3570K @ 4.4GHz, 8Gb Ram, DX11.
nVidia GTX760 (2GB) (Driver 335.23).

Distinguished Contributor
eodeo
Posts: 1,165
Registered: ‎09-05-2006

Re: NVIDIA question.

04-03-2008 02:56 PM in reply to: Kojima_Pro
Maxtreme is just a name now. It used to add performance under OpenGL a while back, when D3D was still the underdog. Since that changed, gaming cards have overtaken workstation cards in performance, and Maxtreme adds little to no benefit to the Quadro line of cards.

Like I said, Maxtreme is just a name now.
Distinguished Contributor
andy.engelkemier
Posts: 314
Registered: ‎08-22-2006

Re: NVIDIA question.

06-03-2008 02:03 PM in reply to: Kojima_Pro
You couldn't be more wrong.
I keep looking to see when Maxtreme will be out for 2009. Whatever the Maxtreme drivers do, they do it extremely well. It's not just a name.
I use it, and have used it for quite a while now. I open the same scenes my coworkers do. One scene is very heavy, and they have to display everything as boxes just to pan around it. I opened the same scene, shaded it, and rotated around with no problem. They are using D3D with the same Quadro FX 3450 I have. This is not just one case, either.

Yes, for high-poly objects like heavy characters, the Maxtreme driver may not help. But for heavy architectural scenes, or in my case product scenes, it does an amazing job. We're also usually the people who have workstation cards, since we also use programs like Pro/E, Rhino, StudioTools, and other software that really takes advantage of the card.

Even with Max 2009, I still think the viewport speed blows. Or at least it will until I get my Maxtreme for it.
Distinguished Contributor
eodeo
Posts: 1,165
Registered: ‎09-05-2006

Re: NVIDIA question.

06-03-2008 03:09 PM in reply to: Kojima_Pro
You couldn't be more wrong.
Whatever the Maxtreme drivers do, they do it extremely well. It's not just a name.
I use it, and have used it for quite a while now. I open the same scenes my coworkers do. One scene is very heavy, and they have to display everything as boxes just to pan around it. I opened the same scene, shaded it, and rotated around with no problem. They are using D3D with the same Quadro FX 3450 I have. This is not just one case, either.


I'm going to have to stay skeptical here and say that there must be some oddity in your test. Your graphics card, the Quadro FX 3450, is equivalent to a GeForce 6800. I'm not really sure how the QFX 3450 stacks up against the QFX 4000, but since they're both GeForce series 6 "800" parts, I'd say they are within 10% of each other. I'd also say that my 6800 GT is faster than my QFX 4000, or at least as fast. The only place the Quadro is faster (not noticeably, but faster nonetheless) is OpenGL and the SPECviewperf test. In a D3D-to-Maxtreme comparison, as well as card-to-card, they are about the same, with a slight edge to the 6800 GT due to its faster memory and core.

I’m going to suggest reading my post here.

In case you don't feel like reading much, just know this: any $100 card today will be at least 10x faster than your current card. In the case of the ATI HD 3870 ($180), the speed is going to be about 50x faster (actual, not an exaggeration).

Even with Max 2009, I still think the viewport speed blows. Or at least it will until I get my Maxtreme for it.


May I assume that you're using Windows Vista? Or at least that same old GeForce 3 series :smileywink:?

P.S. For Quadro to GeForce comparison go here.
Distinguished Contributor
andy.engelkemier
Posts: 314
Registered: ‎08-22-2006

Re: NVIDIA question.

06-03-2008 03:33 PM in reply to: Kojima_Pro
I am not using Vista, but we only use workstation cards here. We do product design and use Pro/E; it's kind of a must, as it works primarily on OpenGL.
10x faster? Is that like saying a 3 GHz computer will render 3x faster than a 1 GHz computer? Because we all know there isn't a bit of truth to that.
There is a huge difference between benchmark testing and actual use. What I'm talking about here is actual use. I have an old GeForce 3 series at home in a super old computer. It still kicks butt if you compare it to one of the "faster" $100 cards today, so long as you don't throw bump mapping and things like that at it; it just doesn't have the instruction set for that. But with plain old 3D data it does great. I don't use it anymore, but I thought I'd throw that in as an example. I loaded up 3ds Max on a friend's computer with a more recent card, and his didn't handle my data any better than my old GeForce 3. So what's this extra speed for?

I really don't care what the specs are. Show me a card that works, show me practical application and speed that way, and I'm sold. Just telling me it's faster means nothing. I've seen benchmarks before that didn't pan out in the end. And why do so many cards get a huge improvement in actual speed when they get new drivers? Their specs didn't change.
NVIDIA, although I don't like it, basically rips us all off by writing different drivers for the workstation cards even though the hardware is pretty much the same. But those cards also consistently run faster at many of our tasks.

In my everyday use I run Max with a plugin called nPower. If you aren't familiar with it, it keeps the NURBS data inside Max editable, so I can change the polygon resolution whenever I want. It works well with our workflow. With straight polys I haven't noticed much of a difference, but when keeping the original nPower data, which is NURBS-based, I have noticed a huge increase in speed when using the Maxtreme drivers. So for me, it's worth it. Not all workflows will benefit; I won't disagree there.

We will be using Autodesk Showcase for something soon, and there it just looks like straight-up speed and tons of memory, so we are thinking about building an SLI rig with a bunch of $500 1 GB gamer cards. I'm not confident it will work, but I can't argue with the specs... yet.
Distinguished Contributor
eodeo
Posts: 1,165
Registered: ‎09-05-2006

Re: NVIDIA question.

06-03-2008 04:11 PM in reply to: Kojima_Pro
Is that like saying a 3 GHz computer will render 3x faster than a 1 GHz computer?


Similar, but no. It's more like asking: is a quad-core CPU really 4x faster than a single-core one? And in case you don't know the answer to that, let me be the first to tell you: the current Intel Core 2 Quad Q6600 running at 2.4 GHz is 90 times (90x!) faster than an Intel Pentium 4 Prescott 320 (single core) also running at 2.4 GHz, for 3ds Max rendering. The technology advances so fast that you really can't compare apples to apples anymore. A fairer question (although irrelevant) is: is the four-core Core 2 Quad Q6600 4x faster than itself running on only one core? The answer is no. It's about 3.8x as fast, i.e. about 20% shy of the full 4x.
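That "shy of full 4x" figure is exactly what you'd expect when a small slice of the render stays serial. A quick Amdahl's-law sketch (the 98% parallel fraction below is an assumed figure chosen to roughly match that speedup, not a measurement):

```python
# Amdahl's law: speedup on n cores when a fraction p of the work
# parallelizes perfectly and (1 - p) stays serial.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# If ~98% of a render parallelizes (an assumed figure), 4 cores give:
print(round(amdahl_speedup(0.98, 4), 2))  # ~3.77x, i.e. shy of the full 4x
```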

I have an old GeForce 3 series at home in a super old computer. It still kicks butt if you compare it to one of the "faster" $100 cards today, so long as you don't throw bump mapping and things like that at it.


I agree that there is a limit to how much juice you need. But you were the one who threw in "cannot pan without display-as-box". If that's so, you need a faster card; if not, no amount of faster card will matter. It's like ATI and NVIDIA now: ATI is 3x faster, but that's 600 fps compared to 200 fps, both far above the needed 30 fps.

Show me a card that works


I'd be glad to... but since distance is likely a problem, getting a $100 card for yourself might be cheaper than a $200 ticket to here :smileywink:

We will be using Autodesk Showcase for something soon, and there it just looks like straight-up speed and tons of memory, so we are thinking about building an SLI rig with a bunch of $500 1 GB gamer cards. I'm not confident it will work, but I can't argue with the specs... yet.


SLI has never worked in Max, and I don't think that's about to change.
And for the record, any $500 GAMING card you buy today will be tons faster at anything you throw at it (given it has the same or more RAM, which isn't a problem since your card has 256 MB of RAM as I understand it). Today you can't find a $200+ card with less than 512 MB of RAM.

In conclusion, try any of the latest cards and see how well they do. I agree that no amount of benchmarks relates to actual speed at hand. (Which reminds me how a "slower" AMD 2500+ CPU feels 2x faster than an Intel P4 @ 3.6 GHz in regular day-to-day use. No test will show you that.)
Distinguished Contributor
andy.engelkemier
Posts: 314
Registered: ‎08-22-2006

Re: NVIDIA question.

06-03-2008 05:40 PM in reply to: Kojima_Pro
90x faster? Really? I mean, really? So if I do a one-minute rendering on my quad-core computer and then run it again on the single-core computer, it will take 90 minutes? Dude, I don't think so. I could run the same rendering on my old P3 1 GHz computer and it wouldn't take 90 minutes. That's just ridiculous.
I agree with a lot of what you are saying, but be reasonable. Try not to put opinion where fact should go. Or maybe you are confusing percent with times? Also, I've got a dual 2.0 GHz computer and a dual dual-core 3.2 GHz computer. They do the same rendering, and I only see about a 10-25% increase in speed, depending on the render. According to what you are saying, I should have seen about a 260% speed increase.
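To put numbers on the naive model I'm objecting to: if render speed really scaled with cores times clock, the comparison would look like this (a back-of-the-envelope sketch; real renders clearly don't behave this way):

```python
# Naive scaling model: render throughput ~ cores * clock (GHz).
def naive_speedup(cores_a: int, ghz_a: float, cores_b: int, ghz_b: float) -> float:
    return (cores_b * ghz_b) / (cores_a * ghz_a)

# Dual 2.0 GHz (2 cores) vs dual dual-core 3.2 GHz (4 cores):
print(naive_speedup(2, 2.0, 4, 3.2))  # 3.2x predicted, vs the observed 10-25% gain
```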

The reason I threw in the fact that they had to display items as boxes is that it's the same card. We are using the same card, but with different drivers I can do much more. So why is mine so much faster? Drivers. Not hardware speed.

Also, the SLI I was talking about isn't for Max; it's for Autodesk Showcase. I haven't looked into that enough to know whether it's supported; it was just a thought. I'm going to see if we can get a gaming card to compare and see what kind of difference in speed I get. My original argument was for Maxtreme: those drivers are much better than the drivers that come with the Quadro cards. I haven't yet compared them with those of a gamer card. It'll be a while till I can do that, though, since money is hard to come by unless it's easily justified. I'll try it out eventually. Also, I'll only be able to test on the types of scenes we work on, so the results won't always hold for everyone.
Please use plain text.