I ran a few fill studies on the effect of the gate diameter using a hot runner.
Can you help me clarify why the injection time (and injection pressure) did not change even though the gate diameter was modified?
"Fill time" was set to AUTOMATIC in the Process Settings dialog, so I would have expected the resulting fill time/injection pressure to vary, but the results show no difference. The shear rate drops, that's clear.
The diameter of the sprue, runners, and drops was set to 7 mm. I altered the bottom gate diameter in the runner wizard and also set the gate contact diameter in the solver options (not sure if that was really needed - it was set to automatic by default, but I just wanted to make sure I didn't miss anything).
Would you please take a look at the attachments.
Many thanks for your answers!
It appears that the automatic fill time calculation is based on the part model volume and not the gate size. Or at least the part model has a bigger influence on that calculation than the gate size, which I feel is understandable.
Personally, I would not rely on the automatic fill time setting to optimize the fill time for every change to the model/gate. It gives you a good reference point, but manually altering it to achieve more suitable results is recommended.
I usually don't give this much thought for the parts that I deal with, but I would certainly like to hear from others.
Thanks for your response.
I can understand your argument. Yes, you can manually alter the injection time to get better and better results in the end, but the purpose of my study was to determine the effect of the gate diameter.
As far as I know, setting the fill time to automatic is the only way to let the solver determine the fill time.
This is where my problem lies. The fill time results I have don't give me a reliable approximation of future fill times (the values are practically constant), even though I adjusted the gate diameter in the solver, which was set in the runner wizard too.
Is there a way to resolve this?
Thanks a lot!
Let me ask you this: why do you think altering the gate size should change the fill time?
You want the part to be filled optimally, and then you want to size the gate to suit that fill time. You don't want to design the gate first and then select a fill time based on your gate design, as that may not be optimal for the part.
If the part volume to be filled is the same, and there is a certain optimal throughput that achieves the desired pressure and temperature profile in the part, then the gate should be sized according to that optimal throughput. The only thing you do by altering the gate size is gain better control over the desired shear and related properties.
As a result, I don't see why the fill time should change if the part volume does not change (at least for the automatic setting).
To summarize, my approach is:
1. Find the optimal fill time for the part only (without the gate).
2. Design/size the gate based on that optimal fill time (throughput).
3. Re-run with the gate size implemented (at this point, I don't see the need to alter the gate size significantly). Maybe an iteration or two just to fine-tune, if necessary.
Hope you agree.
Thanks for your comments.
My assumption was the following: if the volume of the cavity remains the same, then an increased gate diameter should help achieve faster filling, as more material can flow through the runner in a given amount of time.
I had also expected that the injection pressure would drop with an increased gate diameter.
If the injection time stayed the same, then the injection pressure should have decreased (or, if the injection pressure stayed the same, then the injection time should have dropped).
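A back-of-envelope check of this intuition (not Moldflow's actual solver, just the classic Hagen-Poiseuille relation for laminar, isothermal Newtonian flow in a round channel; the viscosity, length, and flow rate below are illustrative assumptions, not values from the study) shows how strongly the pressure drop across a gate depends on its diameter:

```python
import math

def poiseuille_dp(q, d, length, mu):
    """Pressure drop (Pa) for laminar Newtonian flow through a round
    channel, per Hagen-Poiseuille: dp = 128*mu*L*Q / (pi*d^4)."""
    return 128.0 * mu * length * q / (math.pi * d**4)

# Illustrative numbers only (assumed, not from the attached study):
mu = 300.0      # melt viscosity, Pa*s (assumed Newtonian)
length = 0.040  # gate/channel length, m
q = 20e-6       # volumetric flow rate, m^3/s (i.e. a fixed fill time)

for d_mm in (2.5, 5.0, 7.5):
    dp = poiseuille_dp(q, d_mm / 1000.0, length, mu)
    print(f"d = {d_mm} mm -> channel pressure drop = {dp/1e6:.2f} MPa")
```

Because the pressure drop scales as 1/d^4, doubling the diameter at a fixed flow rate cuts the channel's share of the injection pressure sixteen-fold, which is why I expected at least one of the two responses to move.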
When I looked at my previous results (first attachment), neither of the two had changed.
I did another four runs (fill only) on a simple geometry (attached), using no runner and only setting the gate diameter in the solver options. Contrary to the results I posted previously, at least now I can see that the injection pressure dropped dramatically, but at the last two diameters the injection time slightly increased... (a question of geometry?)
Anyway, thanks for the design tips, I'll keep them in mind.
Not to discourage you, but I don't think your trial study is an apples-to-apples comparison. You have simplified the geometry such that the fill pressures are extremely low, so percentage-wise the differences look drastic, especially over a small set of entities (i.e. part only). I'd rather look at absolute changes in values; then you see it is only a 0.8 MPa improvement.
I think if you added a runner to your trial setup and performed the same study, you'd notice that the pressure drop is not as drastic percentage-wise, but I expect it will be fairly close in absolute terms.
ON A SEPARATE NOTE: I did find something in the Moldflow help about the automatic fill time calculation:
If you set the injection time to Automatic, the analysis finds the injection time which gives the lowest injection pressure. The following graph shows the results from nine analyses on the same part. The blue points represent the analyses where the injection time was set to a particular value. The red point represents the analysis where the Automatic injection time check box was selected, which shows the lowest possible injection pressure for the part.
SORRY FOR NOT INCLUDING THE IMAGE.
Figure 1. Injection pressure as a function of time
The variation of injection pressure against injection time has two influences. Firstly, as the injection time increases from zero, the pressure required to force the molten plastic through the part decreases. Secondly, as the injection time increases, the polymer temperature decreases due to heat transfer to the mold, which causes the viscosity and frozen layer thickness to increase, which in turn increases the injection pressure.
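The two competing influences described in that help excerpt produce a U-shaped pressure-versus-time curve with a single minimum, which is what the Automatic setting searches for. A toy model (my own illustration, not Moldflow's algorithm; both coefficients are invented) makes the shape concrete:

```python
# Toy model of the two competing influences on injection pressure:
# a flow-resistance term that falls as the fill time grows, and a
# cooling/viscosity term that rises with the fill time.
# The coefficients A and B are invented purely for illustration.
A = 400.0   # "fast fill" term: pressure ~ A / t   (MPa*s)
B = 4.0     # "cooling" term:   pressure ~ B * t   (MPa/s)

def injection_pressure(t):
    return A / t + B * t

# Scan candidate fill times and pick the one with the lowest
# pressure, mimicking what the Automatic setting optimizes for.
times = [0.5 * k for k in range(1, 61)]   # 0.5 .. 30 s
best = min(times, key=injection_pressure)
print(f"optimum near t = {best} s, "
      f"P = {injection_pressure(best):.1f} MPa")
# prints: optimum near t = 10.0 s, P = 80.0 MPa
```

Note that nothing in this toy objective depends on the gate diameter directly, which is consistent with the observation that the automatically chosen fill time barely moves when only the gate changes.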
After we had a few conversations, the problem regarding the fill time was still bothering me, especially when you asked me why I thought that the fill time was affected by the gate diameter...
I was thinking about an example: filling a cup with water. In the first case, one opens the tap slightly, while in the second case the tap is fully open. Regardless of the viscosity, faster filling should be achieved in the second case.
So I created three cubes in CAD software, each 40x40x40 mm, the first one having a 2.5 mm channel, the second a 5 mm channel, and the third a 7.5 mm channel.
The key thing is that I didn't create the runners in Moldflow; instead, I drew the cubes plus extruded circles exactly at the center on top of them, and later declared these as hot runners prior to starting the simulation. The length of the channel was also 40 mm.
All settings were default, and a 3D mesh was used.
Please have a look at my recent results regarding the fill time (attached as well).
This is what I expected from the beginning, and this is what I missed very much in my previous results, when I changed the runner geometry in Moldflow and set the gate contact diameters in the solver options.
Doubling the channel diameter (and consequently the diameter of the gate) not only helps to control the shear but also reduces the fill time.
2.5 mm --> ~20 sec.
5.0 mm --> ~15 sec.
7.5 mm --> ~9 sec.
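Interestingly, a purely channel-controlled, Hagen-Poiseuille-dominated fill would scale as 1/d^4, so doubling the diameter would cut the fill time roughly sixteen-fold; the much milder trend in these numbers suggests that the part cavity itself dominates the flow resistance once the channel is opened up. A hedged sketch of that series-resistance idea (all resistance values are arbitrary illustrative units, not fitted to these results):

```python
# Sketch: model the fill as flow through two hydraulic resistances in
# series -- the channel (scales as 1/d^4 per Hagen-Poiseuille) and the
# part cavity (independent of the channel diameter). At a fixed driving
# pressure, the fill time is proportional to the total resistance.
# R_PART and R_CHAN_REF are arbitrary illustrative units, not fitted.
R_PART = 5.0        # cavity resistance (assumed constant)
R_CHAN_REF = 11.0   # channel resistance at the 2.5 mm reference diameter

results = {}
for d in (2.5, 5.0, 7.5):
    r_chan = R_CHAN_REF * (2.5 / d) ** 4
    results[d] = R_PART + r_chan      # relative fill time
    print(f"d = {d} mm -> relative fill time {results[d]:.2f}")
```

The model reproduces the qualitative behavior: a large drop in fill time on the first diameter step, then diminishing returns as the constant cavity resistance takes over.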
According to the past results, Moldflow did not (or was unable to?) properly take my custom-created runner systems into account. It seems it only works when the runners are a true part of the part geometry, as in the present case.
At least it's good to see now that the results conform to reality.