Fusion Design, Validate & Document
Thermal Simulation - LED Array

SOLVED
Message 1 of 2
alexbodell
1556 Views, 1 Reply

Hi, I'm very excited to have the thermal simulation function added to the Fusion 360 package! I've done limited thermal simulation in the past, so I'd appreciate any and all feedback and suggestions.

 

I've been searching the forum and have found a few helpful references: 

1 - "simulating led array and heat sink workflow ?"

2 - "Trying out thermal simulation"

3- "CP12977: Simulation for Fusion 360"

 

Building off these references, I'd like to get suggestions for setting up the following simulation (I'm attempting to optimize a heatsink for an LED fixture).

 

System:

- I know the LED manufacturer, package size (5 mm x 5 mm), number and spacing of LEDs, and power level

- In production, the LEDs will be soldered to a 2mm thick aluminum core PCB

- Thermal adhesive will attach the PCB to the heatsink 

- The heatsink will not have a finish (anodize, powder coat, etc.)

- I have published values for the thermal resistance at each junction 

 

I'd like to simulate 1 fixture suspended in open space.  The ambient temperature is 28C.

The fixture assembly includes a heatsink and PCB (for my initial simulation I did not include the individual LEDs).

 

Referencing the "Simulation for Fusion360" video, I assigned an "internal heat" value of 50 watts to the PCB (I have an existing sample that I used to measure the total power consumption of the LEDs).

 

I then assigned a convection load to the entire fixture (PCB + heatsink) at 28C.

Finally, I assigned a convection load just to the heatsink, deselecting the mating face on the heatsink that attaches to the PCB.

 

For both convection loads, I started with a convection value of 30W/(m2C). This returned a maximum temperature well below the value I measured on an existing prototype sample at 28C ambient.

 

Adjusting both convection values, I found that a value of 3.5W/(m2C) returned a maximum temperature of 51C, which is very close to what I measured.

 

So, my first question is: how do I select a proper convection value? Based on the "Trying out thermal simulation" reference, free-air convection values fall within a range from 2W/(m2C) to 25W/(m2C). Am I correct to adjust this value to match my testing, and then keep these values for future design iterations?
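
For reference, the quantity being adjusted here enters through the steady-state balance Q = h * A * (T_surface - T_ambient). Below is a rough hand-calculation sketch (not a Fusion feature) of back-solving h from the measured temperature; the 50 W, 28C, 51C, and 3.5W/(m2C) figures come from this post, while the exposed surface area is a made-up placeholder.

```python
# Rough lumped-parameter check of the steady-state balance Q = h * A * (T_s - T_amb).
# Q and the temperatures are taken from this post; the exposed surface area is a
# PLACEHOLDER -- substitute the actual wetted area of your heatsink + PCB.

Q_watts = 50.0        # total heat dissipated by the LED array
T_ambient_C = 28.0    # ambient temperature
T_measured_C = 51.0   # max temperature measured on the prototype
A_m2 = 0.30           # placeholder exposed surface area in m^2

# Back-solve the average convection coefficient implied by the measurement
h_implied = Q_watts / (A_m2 * (T_measured_C - T_ambient_C))
print(f"Implied average h: {h_implied:.1f} W/(m2C)")

# Or go the other way: predict the surface temperature for a trial coefficient
h_trial = 3.5  # W/(m2C)
T_predicted_C = T_ambient_C + Q_watts / (h_trial * A_m2)
print(f"Predicted surface temperature at h = {h_trial}: {T_predicted_C:.1f} C")
```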

 

What about conduction and assigning thermal resistance values between the PCB and heatsink? The "Simulation for Fusion 360" video did not reference or include this.

 

For basic design optimization, is it appropriate to run the thermal simulation with an Internal Heat applied to the PCB (knowing the power dissipated by the LEDs)? Or should I be including the individual LEDs and applying Internal Heat to them?

Message 2 of 2
jdalidd
in reply to: alexbodell

Hi alexbodell,

 

For free convection, the original value of 30W/(m2C) seems quite high. A value that high would more likely be for forced convection (e.g., if you had a small fan on the heatsink). So, assuming free convection, here is a good reference:

 

http://www.electronics-cooling.com/2001/08/simplified-formula-for-estimating-natural-convection-heat...

 

You can see on the chart that for typical temperatures of 50 deg C, the convection coefficient is between 2 and 6W/(m2C) depending upon the orientation (orientation makes a difference since free convection is driven by the buoyancy of the air). So a value of 3.5W/(m2C) seems very reasonable. Since that matches your measured data, it should be accurate for other test cases, provided the general shape of the heatsink and configuration stays the same and the temperatures are similar. Since the convection coefficient is dependent upon the surface temperature, you may need to make adjustments if you have configurations that are significantly hotter or cooler than 50C.
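
A minimal sketch of that orientation and temperature dependence, using the commonly quoted simplified flat-plate correlations of the form h ~ C * (dT/L)^0.25. The coefficients below are standard textbook values and may differ slightly from the ones in the linked article, and the 0.15 m characteristic length is just an example:

```python
# Simplified natural-convection estimate for a flat plate in air:
#     h ~= C * (dT / L)**0.25
# with commonly quoted textbook coefficients (ballpark values only).

COEFF = {
    "vertical plate":                  1.42,
    "horizontal plate, hot side up":   1.32,
    "horizontal plate, hot side down": 0.59,
}

def h_natural(delta_T_C, L_m, orientation):
    """Approximate free-convection coefficient in W/(m2C)."""
    return COEFF[orientation] * (delta_T_C / L_m) ** 0.25

# Example: surface ~23C above a 28C ambient (i.e., ~51C), 0.15 m characteristic length
for orientation in COEFF:
    h = h_natural(23.0, 0.15, orientation)
    print(f"{orientation:32s} h ~= {h:.1f} W/(m2C)")
```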

 

For your other question:

What about conduction and assigning thermal resistance values between the PCB and heatsink?

 

This really depends upon the thermal gradient through your assembly. If you are seeing a large difference in temperature between the PCB and Heatsink for your physical test (i.e., measured values), then more accurately modeling in the thermal resistance may help. But if the steady state temperature profile is mostly uniform, then it probably isn't necessary to include these effects.
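
One quick way to judge this before changing the model is a simple series-resistance estimate of the temperature drop across the bond line, R = t / (k * A). The adhesive thickness, conductivity, and contact area below are hypothetical placeholders, not values from this thread:

```python
# Quick series-resistance estimate of the temperature drop across the
# PCB-to-heatsink adhesive, treated as pure conduction: R = t / (k * A).
# Thickness, conductivity and area are hypothetical placeholders -- use the
# adhesive datasheet and your actual PCB footprint.

Q_watts = 50.0        # heat flowing through the interface (from the post)
t_m = 0.1e-3          # bond-line thickness, 0.1 mm (placeholder)
k_W_mK = 1.0          # adhesive thermal conductivity, W/(m*K) (placeholder)
A_m2 = 0.30 * 0.05    # contact area, m^2 (placeholder)

R_KW = t_m / (k_W_mK * A_m2)   # conduction resistance, K/W
dT_C = Q_watts * R_KW          # temperature drop across the bond line

print(f"Interface resistance: {R_KW:.4f} K/W")
print(f"Temperature drop across adhesive: {dT_C:.2f} C")
# If this drop is small next to the heatsink-to-ambient rise, explicitly modeling
# the interface in the simulation is unlikely to change the results much.
```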

 

For your last question:

For basic design optimization, is it appropriate to run the thermal simulation with an Internal Heat applied to the PCB (knowing the power dissipated by the LEDs)? Or should I be including the individual LEDs and applying Internal Heat to them?

 

I think applying the heat directly to the PCB is a valid simplification for basic design optimization. One other thing to keep in mind is that some of the power consumed by the LEDs will be converted into light; with the increasing efficiencies of modern LEDs, this can actually be significant. The datasheet for the LED (e.g., Cree) should have data on the luminous efficacy.
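
As a rough illustration of that correction (the 40% radiant efficiency below is an example figure only, not from a datasheet):

```python
# Rough correction of the applied heat load for power emitted as light.
# The 40% radiant (wall-plug) efficiency is an example value only -- derive the
# real figure from the LED datasheet's luminous efficacy.

P_electrical_W = 50.0       # measured total power consumption (from the post)
radiant_efficiency = 0.40   # example: 40% of input power leaves as light

P_thermal_W = P_electrical_W * (1.0 - radiant_efficiency)
print(f"Internal Heat to apply: ~{P_thermal_W:.0f} W instead of {P_electrical_W:.0f} W")
```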

Jonas Dalidd
