So, all models are wrong to some extent, and for various reasons. For electronic thermal simulation the main suspect is power, as described in Part I. Weighing in at anything up to +/-20% (on a good day), it is the main reason for model inaccuracy. The next in line, unlike power, is much more under your control as a CFD modeller. It is grid.
Sometimes referred to as a 'mesh', a grid is a subdivision of the 3D volume that is to be studied into many small tessellated volumes, more mathematically 'control volumes', more commonly 'grid cells'. The governing partial differential equations are then integrated over each volume by the CFD code, and lots of very clever, highly mathematical jiggery-pokery (well, how would you spell it?) then ensues. Finally, out the other end of a long wait, pop values of temperature in each and every one of these grid cells (plus pressure and velocity in the cells not covered by solid material).

The more grid cells you have, the longer you have to wait. For FloTHERM, if you have a model of say 10,000 cells you wait seconds. If you have a model of 5 million cells you wait hours (normally you sleep during that time, oft referred to as 'night'). More worryingly, if you have too few cells your simulation accuracy goes down. Oh no, not one of these trade-off thingies: accuracy for waiting time. The latter is easily judged, the former isn't. How's that for modeller responsibility?
A grid is a form of resolution: resolution of the geometry you are representing AND resolution of the various gradients of temperature, air pressure, speed etc. that occur within your electronics. Actually, you could argue that it's all a gradient capture issue; geometry interface gradients are just very, very sharp, that's all.
“How much grid do I need?”
“Enough to resolve what’s going on”
“Jeesh, why are you guys always so obtuse?”
There's a lot of fluid and heat flow stuff going on in an electronics box. Depending on what it is you want to find out from the solution, some things will be important for the simulation, others will not. For standard 'what is the junction temperature of my component?' type simulations you have to resolve all the thermal resistances between the heat source and the cooling ambient. Sometimes that takes a lot of grid, other times less. Take the following picture (reach for the lasers…)
Each 'pixel' is a grid cell. You don't need many pixels to resolve the smiley. Take the same grid density and apply it to a tump of lovely cider apples:
You need a really fine mesh to distinguish the individual apples. We always advise doing what is called a 'grid sensitivity' test: progressively refine the mesh, solve, note down the value of the thing you're interested in, refine, repeat, until such time as making the grid finer does NOT result in a change in the parameter you're interested in. THEN you'll have a mesh with the best balance between accuracy and solution speed. When people are first taught this I'm sure they're so busy digesting the concept that they only later realise they should have said:
“Hey, hold on, you telling me I’ve got to do lots of simulations to find out which one out of them all would be the fastest yet still accurate, that’ll take AGES!!??”
Fair point. After a while you gain experience in where the grid needs to be fine and where it doesn't. Get it wrong and you'll be pleased to be looking at your pretty CFD pictures within minutes, but maybe not so pleased to realise that the dT errors may be of the order of 50-100%. How to make sure that doesn't happen? At some stage compare with experimental data; that's a great modelling educational aid. Allocate some time to do a range of grid independency tests on your typical applications. An F1 driver will test drive his car, learn from it, know how and when to push it hard, all before taking it out in a race. Do the same with your CFD code.
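The grid sensitivity procedure itself is just a loop, and it can be sketched in a few lines. The 'solver' below is a deliberate stand-in, a made-up formula whose answer settles towards 145 °C as cells are added, because a real CFD solve won't fit in a blog post; the tolerance and cell counts are likewise illustrative only:

```python
def peak_temperature(n_cells):
    """Stand-in for a real CFD solve: a made-up function whose answer
    approaches 145 degC as the grid is refined, just so the refinement
    loop below has something to chew on."""
    return 145.0 + 1.25e4 * (0.1 / n_cells) ** 2

def grid_sensitivity(solve, n_start=4, tolerance=0.5, max_rounds=10):
    """Keep doubling the cell count until the monitored value changes by
    less than `tolerance` between successive solves. That grid is then
    'independent' to within the chosen tolerance."""
    n = n_start
    previous = solve(n)
    for _ in range(max_rounds):
        n *= 2
        current = solve(n)
        if abs(current - previous) < tolerance:
            return n, current
        previous = current
    raise RuntimeError("still grid dependent after max_rounds refinements")

n_cells, value = grid_sensitivity(peak_temperature)
print(f"grid independent at {n_cells} cells, peak = {value:.2f} degC")
```

The point the dialogue above makes still stands: each trip around that loop is a full solve, so the cost of finding the right grid is several wrong grids.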
Some pointers: Put a fine grid in high gradient areas, typically in the PCB around a critical component, especially in natural convection or conduction cooled environments. Generally put a fine mesh in the air next to surfaces through which a fair amount of the heat flows, e.g. heatsink fins, PCB surfaces, tops of components, sides of a sealed enclosure.
OK, this sounds clear enough.
“You CFD software vendor guys, just come up with an automatic method of determining the best balance between additional grid cells for accuracy resolution but not so many as will make me wait unduly for my solve to finish”
Believe me, we'd love to. Modern CFD is only about 40 years old, still evolving in leaps and bounds. One area that's certainly going to come to the fore is solution adaptive meshing. In this method the mesh refines itself automatically during the solution to resolve gradients of a solution variable, putting finer cells where, say, the dT/dXi gradients are large and taking them away from where they are small (fewer cells for smiley, more cells for tumpy):
This is already in our FloEFD suite of products. Sweet. The glittering future of CFD will, I'm sure, evolve this concept so as to adapt the mesh not purely to resolve local gradients, but to ensure that the reasons for, or the goals of, the simulation themselves are resolved accurately with minimal mesh.
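For the curious, here's a one-dimensional caricature of the adaptive idea, my own toy, not FloEFD's algorithm: wherever the temperature change across a cell is too steep, split that cell in two and look again. The temperature field is a made-up sharp front near x = 0.5, and the 5 °C threshold is arbitrary:

```python
import math

def refine(xs, temp, max_jump=5.0, max_passes=8):
    """Caricature of solution-adaptive meshing in 1D: wherever the
    temperature change across a cell exceeds `max_jump` degrees, split
    that cell in two, then sweep again until nothing needs splitting."""
    for _ in range(max_passes):
        new_xs = [xs[0]]
        for left, right in zip(xs, xs[1:]):
            if abs(temp(right) - temp(left)) > max_jump:
                new_xs.append(0.5 * (left + right))   # split the steep cell
            new_xs.append(right)
        if len(new_xs) == len(xs):
            return new_xs                             # nothing left to split
        xs = new_xs
    return xs

# Made-up temperature field: a sharp rise near x = 0.5, flat elsewhere
temp = lambda x: 20.0 + 80.0 / (1.0 + math.exp(-80.0 * (x - 0.5)))

# Start from 10 uniform cells; cells end up clustered around the front
# while the flat regions keep their original coarse spacing
mesh = refine([i / 10 for i in range(11)], temp)
```

Fewer cells for smiley, more cells for tumpy, and nobody had to place them by hand.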
For now, gridding is as much an art as it is a science. The art is in achieving a good enough resolution whilst minimising the solution time. It ain't called 'modelling' for nothing, you know.
Ross-on-Wye, 28th May 2009