I was just flicking through the released 2014 text files and spotted a few phrases that raised eyebrows and left me wondering whether there are fun and games in store for some gaming-card and/or laptop users...
http://images.autodesk.com/adsk/files/autodesk_inventor_2014_readme_enu.htm
^ 2014 readme (covering installation notes and info on some potentially common-ish bugs in the released build)
1) ATI cards:
For machines with certain ATI/AMD graphics cards, to avoid issues in the graphics window, use the Performance or Compatibility graphics setting instead of the Quality setting.
What ATI cards? Gaming cards, workstation cards, all of them, or what? Or is it a vague get-out-of-jail clause saying ATI cards may not perform as well as Nvidia cards as a whole?
2) Optimus laptops:
For laptop computers with both an NVidia GeForce GPU and an Intel integrated GPU, change graphics from discrete GPU to Intel HD integrated graphics to ensure that Inventor 2014 works properly.
If all Iv2014 Nvidia laptop users are required to use the integrated GPU and not the Nvidia card they've paid for, then I'm guessing there will be a considerable number of complaints and concerns...
For a few years now, Nvidia have been using "Optimus" technology to allow automatic switching between the i5/i7 integrated graphics chip and the laptop's Nvidia card (to maximise battery life when no 3D graphics are being processed). Now... the comment in the readme suggests there's suddenly an issue with 2014 and the GeForce Optimus technology... If it's Optimus, then aren't the newer Quadros also potentially affected, as I think they're now Optimus too? Or is it a GeForce, or driver, issue, and thus is there a potential that issues may arise with desktop GeForce cards/drivers?
Would love some explanation and further details for those comments in the readme... And, also thought I'd bring these comments to the attention of others, for any hardware considerations.
Sam M.
Inventor and Showcase monkey
Please mark this response as "Accept as Solution" if it answers your question...
If you have found any post to be helpful, even if it's not a direct solution, then please provide that author kudos - spread that love 😄
Solved by ChrisMitchell01.
I have a Dell laptop w/ Optimus disabled with the newer Quadro cards and Inventor 2014; no performance problems so far.
I also have a Dell desktop with a GeForce card using the latest Nvidia driver; I haven't run into any performance problems there either.
DarrenP
Did you find this post helpful? Feel free to Like this post.
Did your question get successfully answered? Then click on the ACCEPT SOLUTION button.
I have been using the 2014 beta on a Dell XPS laptop which has a Nvidia card installed and encountered no problems.
It would be interesting for some Autodesk clarification on the matter though.
Mike
Thanks for bringing that up Sam, we use GeForce cards as well. Maybe it's a good thing that we held on to our Quadro FX cards... just in case.
AutoCAD 2013; AutoCAD Electrical 2013; IV 2013 Professional: Tube and Pipe, Frame Generator; Vault Professional 2013
Win 7 x64, Dell Precision PWS690, GeForce GTX580
Chris Benner
Inventor Tube & Pipe, Vault Professional
@DarrenP wrote:i have a dell laptop w/ Optimus disabled with the newer Quadro cards and Inventor 2014 no performance problems so far
i also have a dell desktop with a geforce card using latest nvidia driver i haven't run into any performance problems there either
How have you disabled the Optimus tech? I've got an XPS laptop and have never been able to lock it out completely.
The Optimus tech pdf from Nvidia: http://www.nvidia.com/attach/3039887.html?type=support&primitive=0 states:
Using NVIDIA's Optimus technology, when the discrete GPU is handling all the rendering duties, the final image output to the display is still handled by the Intel integrated graphics processor (IGP). In effect, the IGP is only being used as a simple display controller
So, as I read that, there's no way to disable the i5/i7 gpu and force it to use the GeForce/Quadro 100% as it still requires the integrated gpu to act as the middle-man between the GeForce/Quadro and the display.
Sam M.
Inventor and Showcase monkey
I went into the BIOS to disable Optimus,
then uninstalled all the Intel graphics drivers.
DarrenP
Come on AD - 2014 is starting to ship and people are now starting to install it. Can you provide some clarification to the points raised in the readme regarding graphics cards?
What version of Optimus is problematic, and how? GeForce 6xx better/worse than the 5xx family? newer drivers/bios help?
How vague is "with certain ATI/AMD graphics cards"? That could mean anything from one ATI card to all of them... Surely there's a list of tested cards that could narrow this statement down?
Do these problems exist in 2013 - ie, if a user is happy with 2013's performance then 2014 will be similar?
How are the "expert elite" meant to be able to support other users if we are unable to get information to provide this help? Surely there's a wealth of evidence and information to support these statements, if they're serious enough to go in the launch readme.
Sam M.
Inventor and Showcase monkey
That's "lawyer" type CYA stuff.
They already stated years ago that they don't "test/certify" graphics cards anymore. They have probably had a few issues with ATI cards and just throw that in there to cover themselves. There have been blanket statements about both Nvidia and ATI over the years... That's just computer/software life.
Hence the push for Software in the "Cloud".. Many of your hardware support issues disappear that way.
Not to mention I'm fairly sure all the benchmarking posts on here show Nvidia outperforming ATI across the board, time and time again.
Now, I appreciate all that, but if it was just "get out of jail" lawyer rubbish then surely it would have been present in the 2013 (and earlier) text? And/or they would have just left it as the usual "AD recommend the use of certified GPUs" instead of these specific comments about ATI and Optimus cards.
This is something new for 2014... either out of problems discovered during the 2014 beta (and thus specific to 2014, hence my desire to know more details) or historic problems found in 2013 (or earlier) and unchanged in 2014. Thus my question: if we're happy with gaming cards on 2013, is it the same?
Hence my concern for those of us with DirectX gaming cards, which have been generally problem-free (well, no more problematic than workstation cards) with all recent releases. I know some users have had frustration with anti-aliasing on the 6xx series of GeForce cards, but (to me) that's not a functional show-stopper like repeated crashing, lock-ups, etc. So I'm just trying to get an idea of the history, justification and severity of these comments (to help all of us understand the hardware requirements and limitations of the new release).
Sam M.
Inventor and Showcase monkey
I use a GT690 series card at work on my new machine and a GTX590 without any problems, until I did a test yesterday with "bleed-through" using the "ambient shadows" setting. Not sure if it's an IV2014 issue or a Windows DirectX issue. The poster was having a problem with an AMD card and it shows up on my Nvidia card as well.
There are two items discussed in this thread.
1. AMD/ATI GPU issue with Quality setting. This affects a wide variety of AMD GPU HW and is caused by a GPU driver issue with Multi-Sample Anti-Aliasing technology (MSAA). The GPU is incorrectly handling the updates when MSAA is in use and that is an important characteristic which separates Quality from Performance. We are working with AMD on the problem and it should be resolved in a GPU driver update.
2. Optimus laptops: This does not affect every laptop, but for those who see the problem it is very frustrating. The problem is that Autodesk products on the problem GeForce laptops will not be able to use the GeForce GPU HW. It only affects nVidia GeForce GPU HW, not Quadro GPU HW. The issue is not new; it is essentially what was reported here a year ago and, as you can see, it is not unique to Inventor but affects other Autodesk products such as Revit as well. This is an nVidia GeForce GPU driver issue and nVidia is aware of the problem. They are working on a solution, but it is complicated by the fact that not every GeForce laptop exhibits the problem.
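For anyone curious what the MSAA step in item 1 actually does at the pixel level, here's a toy sketch in plain Python (a hypothetical 4x sample pattern; conceptual only, not Inventor's or the driver's actual implementation): each pixel keeps several sub-samples, and the resolve step averages them. A driver mishandling the updates around that resolve path is exactly the kind of thing that would only show up in Quality mode, where MSAA is in use.

```python
# Toy illustration of 4x MSAA resolve (conceptual only -- not Inventor's
# or Direct3D's actual implementation). Each pixel stores 4 sub-samples;
# the resolve step averages them to produce the final pixel colour.

SUBSAMPLES = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

def resolve_pixel(px, py, inside_triangle, fg, bg):
    """Average the sub-sample colours for the pixel at (px, py)."""
    samples = []
    for sx, sy in SUBSAMPLES:
        covered = inside_triangle(px + sx, py + sy)
        samples.append(fg if covered else bg)
    # Resolve: average each colour channel over the sub-samples
    return tuple(sum(channel) / len(samples) for channel in zip(*samples))

# A triangle edge running diagonally through the pixel grid: a point is
# "inside" when x > y, so pixels sitting on the edge get partial coverage.
edge = lambda x, y: x > y

# Pixel on the edge: 1 of 4 sub-samples covered, so a blended grey
print(resolve_pixel(0, 0, edge, (255, 255, 255), (0, 0, 0)))
```

Pixels fully inside or outside the triangle resolve to the plain foreground or background colour; only edge pixels get the blended values that smooth the jaggies.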
Thanks,
Chris
Chris,
thanks for the reply and the information - it's very helpful to get an understanding of the issues.
Out of interest - could ATI users disable MSAA in their control panel and still work in Quality? Just to give options for those users. Would this still provide some anti-aliasing through a non-MSAA routine? (I don't use an ATI card so don't even know if there's an MSAA toggle in its drivers.)
As for the Optimus technology... as far as I understand it's a mixed bag: good in theory but quite bad in practice, with a lot of programs and games unable to get the laptop to change over to the higher-powered Nvidia card. It's good to see that Quadro laptop cards have the option to disable it, but I'm yet to hear of a GeForce laptop with that capability. Tbh, I thought this issue was fixed, as my Dell XPS laptop used to refuse to change to the GeForce card but after updating drivers it seemed to be OK (not tried it with Inventor 2013 or 2014 tho). As you said, it's a laptop-to-laptop issue (which makes suggesting compatible hardware difficult, especially to students who would be looking at a cheaper "gaming" laptop anyway) 😞
Sam M.
Inventor and Showcase monkey
MSAA is controlled by the Direct3D API calls. This is not under the control of the AMD GPU Driver when using Direct3D 11.
Norbert
Norbert, thanks for the reply. You've been a great help in the past understanding the pros/cons of moving from OpenGL to DX, however many moons ago that was... (and why don't you have an AD tag on your name?)
Is 2014 now DX11? That's a hidden change that I've not seen touted (I thought it was still DX10.1).
Dare I ask - what has this brought to the table (alongside a stricter GPU requirement)? Does this mean the graphics system has had a complete rework?
Does this mean there's GPU processor assistance possible (GPGPU if that's the correct phrase?) e.g. CUDA - I thought DX11 had support for that (DirectCompute?). I know most of Inventor probably wouldn't benefit from this, but could rendering, FEA, etc make use of the GPU for processing assistance?
Sam M.
Inventor and Showcase monkey
I should now be tagged ... a bit of work on my side, a bit of work from our IT department.
Inventor 2014 is DX11. We will use DX9 if we have to ... because your GPU does not support DX10 or higher. Note that DX11 has the ability to work with older GPUs so you can drive a DX10.0 GPU from DX11 ... and we do.
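The fallback Norbert describes is the standard Direct3D 11 feature-level negotiation: the application asks for the highest level first and the runtime walks down the list until the GPU accepts one. A conceptual sketch in plain Python (the level names are real Direct3D feature levels; the picker function and the `gpu_supports` callback are hypothetical stand-ins for what `D3D11CreateDevice` does internally):

```python
# Conceptual sketch of Direct3D 11 feature-level negotiation. The real
# work is done by D3D11CreateDevice; this just mimics the walk-down.
FEATURE_LEVELS = ["11_0", "10_1", "10_0", "9_3", "9_2", "9_1"]

def pick_feature_level(gpu_supports, requested=FEATURE_LEVELS):
    """Return the first (highest) level the GPU reports support for."""
    for level in requested:
        if gpu_supports(level):
            return level
    return None  # no usable Direct3D device at all

# A hypothetical DX10.0-class GPU: the DX11 runtime still drives it,
# just capped at feature level 10_0.
dx10_gpu = lambda level: level in ("10_0", "9_3", "9_2", "9_1")
print(pick_feature_level(dx10_gpu))  # "10_0"
```

So a DX10 card doesn't force the application back to the DX9 API; the DX11 runtime simply reports a lower feature level and the application disables the features that level can't provide.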
The Graphics system has not had a complete rework. It is part of our on-going work to stay current with GPU and advanced visualization technology. DX11 is the current level of GPU capabilities and we support that as our primary interface for full functionality.
Inventor does not use DirectCompute or other GPGPU technology such as CUDA in our Graphics pipeline at present. It is possible we may take advantage of DirectCompute in the future in situations where it provides sufficient benefit for the Graphics pipeline. Using a non-Graphics GPGPU Compute language such as CUDA or OpenCL in a Graphics pipeline is not ideal because of memory sharing issues so unless you are really doing Compute outside of the Graphics pipeline, using DirectCompute is usually much better because it is well integrated with the DirectX Graphics API.
On the subject of graphics, can someone explain what
"The graphics system now uses multiple cores to provide the best possible performance. "
means? Are we talking CPU or GPUs? Does Inventor support multiple graphics cards?
It means CPUs. The GPU is driven from your CPUs and traditionally, there is one CPU which "talks" to the GPU. DirectX 11 added the ability to drive the GPU from multiple threads. There is still only one thread which is actively drawing using the GPU (at a time) but multiple threads can be preparing information for the "drawing" thread to keep it fully occupied "drawing" instead of needing to pause and do other API call processing which involves the GPU (e.g. preparing textures, geometry, etc. to be drawn). The "additional" threads can do that work while the "drawing" thread is now busy drawing all the time (in a fully optimized SW environment).
In addition to driving the GPU, there are other SW aspects of the Graphics Layer pipeline which do not directly involve the GPU, and the Inventor Graphics Layer code has been enhanced to use multiple threads (and therefore, multiple CPU cores) to do that processing.
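The pattern Norbert describes (several threads preparing work, exactly one thread actually submitting it to the GPU) is a classic producer/consumer setup. A minimal sketch in plain Python, with a queue standing in for DX11 deferred contexts feeding the immediate context (the "mesh" names and the print are purely illustrative):

```python
import queue
import threading

# Several "preparation" threads build draw work; exactly one "drawing"
# thread submits it -- mirroring how DX11 lets multiple threads prepare
# command lists while a single thread talks to the GPU.
work = queue.Queue()
submitted = []

def prepare(items):
    for item in items:
        work.put(f"prepared:{item}")  # stand-in for building a command list

def draw_thread(expected):
    for _ in range(expected):
        submitted.append(work.get())  # only this thread "talks" to the GPU

producers = [threading.Thread(target=prepare, args=([f"mesh{i}"],))
             for i in range(4)]
drawer = threading.Thread(target=draw_thread, args=(4,))
for t in producers + [drawer]:
    t.start()
for t in producers + [drawer]:
    t.join()

print(sorted(submitted))
```

The drawing thread never pauses to build work itself; it just drains the queue, which is the "fully occupied drawing" behaviour described above.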
Multiple GPUs have very limited benefit unless you need to drive large numbers of display monitors. If you need to display on two monitors, there is very little if any benefit to having multiple GPUs for Graphics. A second GPU can be useful if you are using it for "compute", such as using our Moldflow solvers which can run on nVidia GPUs using CUDA technology from nVidia.
Or 3ds Max, which will allow you to select which CPUs and GPUs you want to use.
@NorbertGraphics wrote:
Inventor 2014 is DX11. We will use DX9 if we have to ... because your GPU does not support DX10 or higher. Note that DX11 has the ability to work with older GPUs so you can drive a DX10.0 GPU from DX11 ... and we do.
So, am I right in understanding that if you have a DX9 card (or Win XP) you run in DX9 mode, and DX10 and DX11 cards run in DX11 (but presumably some stepped-down mode for the DX10 cards in DX11)? I didn't realize DX11 worked with DX10 cards.
To ask the $64 question - what does DX11 bring to the table? and what is lost when using a DX10 (or 10.1) card? Just thinking that a few people are still on GTX260s (or equivalents) which are only 10.1.
Should the enhanced visualization help page now be updated to reflect DX11?
Or is that still the same, but with an additional step(s) for the DX11 features?
Sam M.
Inventor and Showcase monkey