
Brave New World

Term has started, and our students are preparing for end-of-semester examinations; so, I suspect that they would welcome the opportunity to deploy the sleep-learning that Aldous Huxley envisaged in his ‘Brave New World’ of 2540.  In the brave new world of digital engineering, some engineers are attempting to conceive of a world in which experiments have become obsolete because we can rely on computational modelling to simulate engineering systems.  This ambitious goal is a driver for the MOTIVATE project [see my post entitled ‘Getting smarter‘ on June 21st, 2017], an EU project that kicked off about six months ago and was the subject of a brainstorming session in the Red Deer in Sheffield last September [see my post entitled ‘Anything other than lager, stout or porter!‘ on September 6th, 2017].  The project now has its own website at www.engineeringvalidation.org.

A world without experiments is almost unimaginable for engineers whose education and training is deeply rooted in empiricism, the philosophical approach that requires assumptions, models and theories to be tested against observations from the real world before they can be accepted.  In the MOTIVATE project, we are thinking about ways in which fewer experiments can provide more and better measured data for the validation of computational models of engineering systems.  In December, under the auspices of the project, experts from academia, industry and national labs from across Europe met near Bristol and debated how to reshape the traditional flow-chart used in the validation of engineering models, which places equal weight on experiments and computational models [see ASME V&V 10-2006 Figure 2].  In a smaller follow-up meeting in Zurich, just before Christmas [see my post ‘A reflection of existentialism‘ on December 20th, 2017], we blended the ideas from the Bristol session into a new flow-chart that could lead to the validation of some engineering systems without conducting experiments in parallel.  This is perhaps not as radical as it sounds, because it happens already for some evolutionary designs, especially those that are not safety-critical.  Nevertheless, if we are to achieve the paradigm shift towards the new digital world, then we will have to convince the wider engineering community about our novel approach through demonstrations of its successful application, which sounds like empiricism again!  More on that in future updates.

Image by Erwin Hack: Coffee and pastries awaiting technical experts debating behind the closed door.


Getting smarter

A350 XWB passes Maximum Wing Bending test [from: http://www.airbus.com/galleries/photo-gallery]

Garbage in, garbage out (GIGO) is a perennial problem in computational simulations of engineering structures.  If the description of the geometry of the structure, the material behaviour, the loading conditions or the boundary conditions is incorrect (garbage in), then the simulation generates predictions that are wrong (garbage out), or at least an unreliable representation of reality.  It is not easy to describe precisely the geometry, material, loading and environment of a complex structure, such as an aircraft or a power station, because the complete description is either unavailable or too complicated.  Hence, modellers make assumptions about the unknown information and/or to simplify the description.  This means the predictions from the simulation have to be tested against reality in order to establish confidence in them – a process known as model validation [see my post entitled ‘Model validation‘ on September 18th, 2012].
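To make GIGO concrete, here is a minimal sketch (my own illustration, not anything from the MOTIVATE project) using the textbook formula for the tip deflection of an end-loaded cantilever beam.  The model itself is correct; feeding it one mis-specified input – an assumed, hypothetical load – is enough to corrupt the prediction:

```python
# Illustration of garbage in, garbage out: a correct cantilever-beam
# model produces a wrong prediction when one input (the load) is wrong.

def tip_deflection(load_N, length_m, E_Pa, I_m4):
    """Tip deflection of an end-loaded cantilever: delta = P*L^3 / (3*E*I)."""
    return load_N * length_m**3 / (3 * E_Pa * I_m4)

# Hypothetical steel beam: 1 m long, E = 200 GPa, I = 8e-7 m^4.
E, I, L = 200e9, 8e-7, 1.0

correct = tip_deflection(1000.0, L, E, I)   # true load: 1 kN
garbage = tip_deflection(1500.0, L, E, I)   # mis-specified load: 1.5 kN

print(f"correct input: {correct * 1000:.2f} mm")   # 2.08 mm
print(f"garbage input: {garbage * 1000:.2f} mm")   # 3.13 mm, 50% error out
```

The deflection formula is linear in the load, so a 50% error in gives a 50% error out; in a nonlinear model the input error can be amplified even further, which is why the predictions must be checked against measurements.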

It is good practice to design experiments specifically to generate data for model validation, but this is expensive, especially when your structure is a huge passenger aircraft.  So naturally, you would like to extract as much information from each experiment as possible and to perform as few experiments as possible, whilst ensuring the predictions are reliable and providing confidence in them.  In other words, you have to be very smart about designing and conducting the experiments as well as performing the validation process.
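As a rough sketch of what the validation step amounts to, predictions can be compared with measurements at corresponding points and checked against an acceptance tolerance.  The data and the 10% tolerance below are illustrative assumptions of mine, not a prescription from ASME V&V 10-2006, which addresses validation far more thoroughly:

```python
# Minimal sketch of quantitative model validation: accept the model only
# if every prediction falls within a relative tolerance of its measurement.

def validate(predicted, measured, tolerance=0.10):
    """True if each prediction is within `tolerance` (relative) of the
    corresponding measurement."""
    return all(abs(p - m) / abs(m) <= tolerance
               for p, m in zip(predicted, measured))

predicted = [2.10, 4.05, 6.20]   # model outputs, e.g. deflections in mm
measured  = [2.00, 4.00, 6.00]   # experimental observations in mm

print(validate(predicted, measured))  # True: all within 10%
```

In practice, modern validation compares whole fields of data rather than a handful of points, which is exactly why each experiment can be made to yield much more information.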

Together with researchers at Empa in Zurich, the Industrial Systems Institute of the Athena Research Centre in Athens and Dantec Dynamics in Ulm, I am embarking on a new EU Horizon 2020 project to try and make us smarter about experiments and validation.  The project, known as MOTIVATE [Matrix Optimization for Testing by Interaction of Virtual and Test Environments (Grant Nr. 754660)], is funded through the Clean Sky 2 Joint Undertaking, with Airbus acting as our topic manager to guide us towards an outcome that will be applicable in industry.  We held our kick-off meeting in Liverpool last week, which is why it is uppermost in my mind at the moment.  We have 36 months to get smarter on an industrial scale and demonstrate it in a full-scale test on an aircraft structure.  So, some sleepless nights ahead…

Bibliography:

ASME V&V 10-2006, Guide for verification & validation in computational solid mechanics, American Society of Mechanical Engineers, New York, 2006.

European Committee for Standardisation (CEN), Validation of computational solid mechanics models, CEN Workshop Agreement, CWA 16799:2014 E.

Hack E & Lampeas G (Guest Editors) & Patterson EA (Editor), Special issue on advances in validation of computational mechanics models, J. Strain Analysis, 51 (1), 2016.

http://www.engineeringvalidation.org/