Quote:
Originally Posted by serialk11r
I talked to a MechE PhD student and he told me CFD actually has a long way to go because the algorithms are just too slow (which means, to my surprise as a mathematician, that there is a very, very important practical use for PDE research!). Evolutionary algorithms would not be hard to implement for this purpose, because you could just have the program randomly add bits and pieces to the shape and see where that goes; but since you're running the CFD computations millions of times over, you'd need a better algorithm for doing that.
On the bright side, the current best algorithms are like O(e^(e^(e^n))) or something hideous like that, so cutting it down to, say, a double exponential would make this kind of thing feasible on a supercomputer.
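Side note on the evolutionary-algorithm idea in the quote: the loop itself really is that simple. Here's a minimal sketch, with the expensive CFD solve stubbed out by a toy function (evaluate_drag, mutate, and the shape parameterization are all invented for illustration; in a real run, that one stubbed call is where essentially all the compute goes):

import random

def evaluate_drag(shape):
    # Placeholder fitness: a real version would mesh the shape and run a
    # CFD solve here, costing minutes to hours per candidate.
    return sum((x - 0.5) ** 2 for x in shape)

def mutate(shape, step=0.05):
    # "Randomly add bits and pieces": perturb one shape parameter.
    child = list(shape)
    i = random.randrange(len(child))
    child[i] += random.uniform(-step, step)
    return child

def evolve(n_params=8, pop_size=20, generations=50):
    population = [[random.random() for _ in range(n_params)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=evaluate_drag)
        parents = scored[:pop_size // 2]  # keep the lower-drag half
        population = parents + [mutate(random.choice(parents))
                                for _ in range(pop_size - len(parents))]
    return min(population, key=evaluate_drag)

if __name__ == "__main__":
    best = evolve()
    print("best shape parameters:", [round(x, 3) for x in best])

Which is exactly the quote's point: the optimizer is trivial next to the millions of solver calls it has to drive.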
This student you talked to, is he working with the full Navier-Stokes equations for 3-D flow?
I understand it to be THE numerical tool for analyzing 3-D bluff body flow.
There is a k-epsilon turbulence model, ancillary software dovetailed into FLUENT or another solver, which gets the Reynolds numbers and boundary-layer conditions correct.
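For reference, in its standard textbook form the k-epsilon model solves two extra transport equations, one for the turbulent kinetic energy k and one for its dissipation rate epsilon, and builds an eddy viscosity from them:

\nu_t = C_\mu \, k^2 / \varepsilon, \quad C_\mu \approx 0.09

The solver then uses \nu_t to model the turbulent stresses instead of resolving the fluctuations directly, which is why RANS codes like FLUENT are so much cheaper to run than the full unsteady Navier-Stokes equations.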
And I understand that it does take a supercomputer to take this on. But what if, say, we start with the Ahmed body, run a change on it, then make that data available to the next (and any) investigator, who then runs their change, with each iteration's data available to any interested investigator, so there's never any duplication of work? Parallel processing, as is being done in astronomy?
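A minimal sketch of how that shared-results idea could work, assuming we hash each geometry variant and publish results to a common store (run_cfd, the geometry dict, and the shelve file are all stand-ins for a real solver and a real shared database):

import hashlib
import json
import shelve

def case_key(geometry: dict) -> str:
    # Canonical JSON so the same geometry always hashes the same way.
    blob = json.dumps(geometry, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def run_cfd(geometry: dict) -> dict:
    # Placeholder for the expensive solve (hours on a big machine).
    return {"drag_coefficient": 0.28}

def get_or_run(geometry: dict, store_path="shared_results.db") -> dict:
    key = case_key(geometry)
    with shelve.open(store_path) as store:
        if key in store:            # someone already ran this case
            return store[key]
        result = run_cfd(geometry)  # new case: run it and publish it
        store[key] = result
        return result

if __name__ == "__main__":
    ahmed_variant = {"base": "Ahmed body", "slant_angle_deg": 25.0}
    print(get_or_run(ahmed_variant))

Any investigator querying the same geometry would get the cached result back instead of burning supercomputer time on a duplicate run.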
Seems like you could dial in a form for verification in the wind tunnel with high certainty of its performance beforehand.
Just thinking out loud.