I'm seriously contemplating purchasing a commercial license solely for the generative design capabilities, and I don't want to have to pay for cloud credits IN ADDITION to paying for the license. Will generative design ever get a local compute feature so that I can use my workstation and not have to pay for cloud credits? I don't currently have a commercial license because I'm a student.
@mavigogun wrote:
@mvassilevJSN2W wrote: It's just odd that it is the only way to use it, which leaves me with a sour taste in my mouth.
It's not odd at all; it's in keeping with Fusion's modus operandi of progressively adding features. The GPU-cluster version can't just be ported over for local use with a couple of simple edits; it would have to be rebuilt. So it gets prioritized like everything else. I reckon that, given there is already a functional option, "when" becomes later rather than sooner.
Based on what I've read and seen on this forum and others, they don't have plans to do this; I think they are banking on the cloud system and leaving it at that. I'm glad threads like these are picking up steam so that a local compute option gets put on the map for consideration. The only way that happens is if we show enough interest in a feature like this.
I_Forge_KC, thank you for the informative description.
I came into this conversation frustrated because, as a student, I have used Altair's topology optimization tools and was able to generate a design for the internal supports of an aircraft wing in about 30 minutes on my local computer (a nice, gaming-oriented machine). I think a big frustration for users is that they assume a local solver for Autodesk generative design (ADG) would be comparable in compute time and therefore completely acceptable to use. It seems like you are saying that ADG is a different computational platform that needs much more processing power than the SIMP (Solid Isotropic Material with Penalization) solvers other companies use, and therefore wouldn't work locally. Am I summarizing your thoughts correctly?
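To make the comparison concrete, here is a rough sketch (not Autodesk's solver, and far simpler than Altair's) of the optimality-criteria density update at the heart of classic SIMP topology optimization, assuming NumPy. The finite-element compliance sensitivities are replaced with a synthetic field purely for illustration; the point is that each iteration is a handful of array operations per element, which hints at why this class of solver can finish on a single workstation.

```python
# Minimal SIMP-style sketch: material interpolation + optimality-criteria update.
# The compliance sensitivities (dc) here are a made-up field, not a real FE solve.
import numpy as np

def simp_stiffness(rho, E0=1.0, Emin=1e-9, p=3.0):
    """SIMP interpolation: penalizes intermediate densities toward 0 or 1."""
    return Emin + rho**p * (E0 - Emin)

def oc_update(rho, dc, volfrac, move=0.2):
    """Optimality-criteria update with bisection on the volume-constraint multiplier."""
    lo, hi = 1e-9, 1e9
    while (hi - lo) / (hi + lo) > 1e-4:
        lam = 0.5 * (lo + hi)
        # Scale densities by sqrt(-dc / lam), limited by move bounds and [0, 1].
        rho_new = np.clip(rho * np.sqrt(np.maximum(-dc, 0.0) / lam),
                          np.maximum(rho - move, 0.0),
                          np.minimum(rho + move, 1.0))
        if rho_new.mean() > volfrac:
            lo = lam
        else:
            hi = lam
    return rho_new

# Toy demo: 60x20 element grid, 40% material budget, hypothetical sensitivities.
rho = np.full((20, 60), 0.4)
y, x = np.mgrid[0:20, 0:60]
for _ in range(30):
    # Stand-in for a real FE compliance gradient, peaked near the grid center.
    dc = -simp_stiffness(rho) * np.exp(-((x - 30)**2 + (y - 10)**2) / 200.0)
    rho = oc_update(rho, dc, volfrac=0.4)
print("material fraction:", round(rho.mean(), 3))
```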
I also learned from my experience with Altair that I messed up a lot. I had to tweak the setup multiple times to get the program to understand what I really meant.
I think that's the frustration with the cloud-credit method: it punishes you for not knowing how to set it up perfectly.
I think a better way to implement the cloud computing would be to let users generate a design for free but only preview the model. From the preview they could see whether they made a mistake in their setup. This would also be a great way to let new users learn the software.
If you liked what was generated, you would buy the model with cloud credits and add the solid model to your library.
The risk to Autodesk is that they are paying for every computation you run. I think a free computation plus a purchase price could be set to balance this out. For example, if the average user takes 5 simulations to get a model they are satisfied with, you would set the pricing so that simulation is free and a model costs 125 credits. This would let new users learn without feeling punished and get up to speed faster. The burden falls on companies whose experienced engineers take fewer simulations to reach the design they want, but those companies are probably better established and can shoulder the cost more easily than a hobbyist.
Autodesk could easily build in a limit on the number of simulations a user can run without buying a model, to prevent people from only simulating and never buying. For example, if a user simulates more than 10 times per month but never buys a model, Autodesk could cut off their simulations or start charging for them.
I'm just throwing out ideas that I like; I'm not sure this would actually be a good business model for Autodesk. I do think it would let a lot more users get to know and use the ADG part of Fusion 360 and therefore increase its use overall.
Edit:
Upon further inspection, it looks like Autodesk already uses a model like this: it's 25 credits to simulate and 100 credits to buy the model. (I thought it was 25 to simulate and then no credits to purchase.) So they do balance it out a bit. It still stops people like me from learning on their own, though.
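To put numbers on the difference between the current scheme and my earlier suggestion, here is a quick sketch. The 25/100 and free/125 credit figures come from this thread; the "attempts per accepted model" values are just assumptions for illustration, not Autodesk data.

```python
# Compare credits spent per accepted model under the current and proposed schemes.
def credits_spent(attempts, per_sim, per_model):
    return attempts * per_sim + per_model

for attempts in (1, 5, 10):
    current = credits_spent(attempts, per_sim=25, per_model=100)   # today's pricing
    proposed = credits_spent(attempts, per_sim=0, per_model=125)   # free sims, pricier model
    print(f"{attempts:2d} attempts -> current: {current:3d} credits, proposed: {proposed:3d} credits")
```

The schemes break even at one attempt; the more a user has to iterate while learning, the more the flat model price favors them.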
Also, the class I took was with Dr. Robert Yancey, who I just found out is the Director of Manufacturing and Production Business Strategy at Autodesk and one of their generative design experts (when I was in school, he was with a different company). I would be interested to hear his perspective.
Generative adversarial AI dev here... You are very far off in your estimates of what this costs to run. This can be run on a local workstation without some massive cost. I develop and test applications like this on a workstation that cost around $1,200. The computation currently runs on a GPU, not a CPU. Eventually NVIDIA will complete an ASIC for this type of computation, but until then, any low-end NVIDIA GPU would be capable of calculating this within a couple of days, and a reasonably priced GPU ($200-$400) would do it in 2-8 hours. Autodesk is marking up the price here significantly, and for good reason: there aren't many generative design applications for 3D at the moment. I highly doubt they will enable local compute for this type of service; that doesn't jibe with the business model of any software company looking forward. Every software company wants cloud-based, subscription-style revenue. For Autodesk, this is zero-risk income: they can estimate almost exactly what it will cost to reach a given number of epochs and output your model.
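As a rough illustration of the kind of estimate being made here: the 2-8 hour runtimes, the $200-$400 GPU prices, and the "low-end GPU, a couple of days" figure come from the post above, while the 250 W draw, $0.15/kWh electricity rate, and 200-job amortization are illustrative assumptions, not measured data.

```python
# Back-of-envelope cost of one generative run on a locally owned GPU:
# electricity for the run plus the card's price spread over its useful life.
def local_cost_per_job(hours, gpu_price, jobs_over_gpu_life=200,
                       gpu_watts=250, usd_per_kwh=0.15):
    energy = (gpu_watts / 1000.0) * hours * usd_per_kwh   # kWh * rate for this run
    amortized_hw = gpu_price / jobs_over_gpu_life          # hardware cost per job
    return energy + amortized_hw

for hours, gpu_price in [(2, 200), (8, 400), (48, 200)]:   # best case, worst case, low-end card
    print(f"{hours:2d} h on a ${gpu_price} GPU -> ~${local_cost_per_job(hours, gpu_price):.2f} per run")
```

Under these assumptions the marginal cost per run is on the order of a few dollars, which is the gap the poster is pointing at between local compute and cloud-credit pricing.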