Robustness
2021
-
An Integrated Modeling Scheme for Sensor Embedded Woven Composite Structures in Manufacturing Simulation
Tolga Usta, Christian Liebold, DYNAmore GmbH, Stuttgart, Germany, Mathieu Vinot, Institute of Structures and Design, German Aerospace Center, Stuttgart, Germany
2020
-
Sequential Optimization & Probabilistic Analysis Using Adaptively Refined Constraints in LS-OPT
Anirban Basudhar, Livermore Software Technology LLC, Ansys Group, USA, Katharina Witowski, DYNAmore GmbH, Germany, Imtiaz Gandikota, Livermore Software Technology LLC, Ansys Group, USA
2019
-
Adaptive Sampling Using LS-OPT
Anirban Basudhar, Livermore Software Technology Corporation, Livermore, CA
2018
-
A Study on Scatter during Production Process using Statistical Approach using LS-OPT®
Masahiro Okamura, JSOL Corporation
In recent years, the robustness of car body structures has become more important than ever, as car manufacturers are required to achieve conflicting performance targets at a high level, such as light weight and crash performance. The major sources of scatter in body structures are material scatter and production scatter. Since it is difficult to reduce material scatter, as a certain range of scatter is permitted by industrial standards, tightening the quality control ...
2017
-
Improvement of Response Surface Quality for Full Car Frontal Crash Simulations by suppressing Bifurcation using Statistical Approach
Masahiro Okamura (JSOL Corporation)
In recent years, the importance of optimization has been rising in the automotive industry, as the need to fulfill conflicting requirements such as light weight, rigidity, and safety at a high level continues to increase, while car structures become more complex due to new materials and new joining techniques. The Response Surface Method (RSM) is one of the key technologies for this purpose, and various approaches have been developed. However, the quality of response surfaces tends to be poor for frontal or rear crash cases, where contact and buckling are dominant, since bifurcations in behavior introduce high non-linearity into the response surfaces. One measure is to increase the number of simulation runs in order to improve the accuracy of the response surface, but as full-car simulation models grow in size, running more than 100 simulations is not realistic. The fundamental problem is that the response surface is highly complex due to bifurcations such as buckling and contact; trying to fit a highly non-linear response surface by adding points is therefore not the real solution. Instead, the non-linearity of the surface must be reduced in order to make it easy to fit. In this study, the scatter propagation mechanism is visualized based on statistical calculations, and the structural design of the front structure of an automobile is enhanced in order to suppress bifurcations with the help of the statistical analysis software DIFFCRASH. The triggers of the bifurcations are located, their mechanisms are studied, and design modifications are made to stabilize the deformation modes. As a result, the complexity of the response surface has been reduced and its accuracy improved.
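As a minimal illustration of the effect described in this abstract (not the authors' workflow; the response functions and the jump height are invented for illustration), the sketch below fits a quadratic response surface to a smooth response and to one containing a jump that mimics a buckling-mode bifurcation. The fit quality, measured by R², degrades for the bifurcated case:

```python
import numpy as np

def fit_quadratic_rs(x, y):
    """Least-squares fit of a 1-D quadratic response surface y ~ a + b*x + c*x^2."""
    A = np.column_stack([np.ones_like(x), x, x**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def r_squared(x, y, coef):
    """Coefficient of determination of the fitted surface."""
    pred = coef[0] + coef[1] * x + coef[2] * x**2
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

x = np.linspace(-1.0, 1.0, 41)
smooth = x**2                                     # stable response: no bifurcation
bifurcated = x**2 + np.where(x > 0.3, 0.8, 0.0)   # jump mimicking a mode switch

r2_smooth = r_squared(x, smooth, fit_quadratic_rs(x, smooth))
r2_bif = r_squared(x, bifurcated, fit_quadratic_rs(x, bifurcated))
```

Adding more sample points to the bifurcated case cannot remove the jump; only stabilizing the deformation mode (as the paper does) makes the surface easy to fit again.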
-
Combined Analysis of LS-DYNA Crash-Simulations and Crash-Test Scans
Dominik Borsotto, Lennart Jansen, Clemens-August Thole (SIDACT GmbH)
In robustness campaigns and optimization processes, metamodels are created out of a set of crash-simulations. With the help of such analyses, the models used for the simulations can be improved: for example, instabilities can be found and explained, or the material used can be minimized under certain safety restrictions. An important question in this context is: how well can these metamodels represent reality? To answer this question, one can compare the crash-simulations to the real crash-tests, which were recorded by camera systems after the crash. To be able to compare the test-data with the LS-DYNA crash-simulations, we first need to convert the test-data by matching the geometries and transferring the part information from the simulation to the crash-test. Afterwards, one can calculate the combination of the simulations that approximates the geometry and deformation behavior of the test-data as closely as possible. The distance and difference in behavior between this calculated Best Fit and the actual crash-test can be used to measure the quality of the simulation model. Once the evaluation of the model is finished, the test-data can also be added to a robustness campaign as an additional simulation and used for further analysis. This allows us to answer questions such as: How does the test fit into the simulation subspace? Which simulation runs are similar to the test for a certain crash event? Which of the dominating crash events found in the simulation can also be found in the test? Thus, the described matching procedure, combined with the further analysis methods, allows on the one hand for a quick and automated matching between test and simulation, and on the other hand for a more detailed validation of the simulation model against the actual test. Since the test-data is converted, the same post-processors can be used for both the simulations and the test-data, resulting in a smoother workflow.
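The Best Fit step can be sketched as an ordinary least-squares combination of simulation results approximating the test scan. This is a minimal sketch under invented assumptions (matrix sizes, mixing weights, and noise level are made up; the real workflow operates on matched scan and simulation geometries):

```python
import numpy as np

rng = np.random.default_rng(0)

# Columns of S: nodal deformation values of five individual simulation runs.
S = rng.normal(size=(200, 5))

# Synthetic "test scan": a hidden combination of the runs plus measurement noise.
test = S @ np.array([0.5, 0.3, 0.2, 0.0, 0.0]) + 0.01 * rng.normal(size=200)

# Best Fit: the combination of simulations closest (in least squares) to the test.
weights, *_ = np.linalg.lstsq(S, test, rcond=None)
best_fit = S @ weights

# The remaining distance measures how well the simulation subspace covers the test.
residual = np.linalg.norm(test - best_fit) / np.linalg.norm(test)
```

A small residual means the test lies close to the span of the simulation runs; a large one flags deformation behavior the simulation model does not reproduce.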
2016
-
Use of Data Reduction Methods for Robust Optimization
Dominik Borsotto, Lennart Jansen, Robin Strickstrock, Clemens-August Thole (SIDACT GmbH)
Data reduction methods like principal component analysis, singular value decomposition and independent component analysis allow the analysis of huge sets of data. Applied to simulation results, they allow the characterization of the major trends in the variation of these results. For the public Chevrolet Silverado example, all thicknesses are initially varied independently of each other across a number of simulation runs, and their correlations to the variation of the firewall behavior are computed. The behavior of the firewall is approximated using data reduction methods. It turns out that the variation of the firewall can be characterized by one basic deformation mode. The thickness variation of a part may show a strong or weak correlation to the behavior of this deformation mode. In several steps, the sensitivity analysis is then repeated using only those parts for thickness variation which had a strong correlation in the previous step. Finally, it turns out that the thicknesses of the longitudinal rails, as well as a certain bifurcation behavior of the longitudinals, are responsible for this variation mode of the firewall.
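A minimal sketch of the data-reduction idea, using synthetic data in place of the Silverado results (the deformation mode, scatter level, and array sizes are invented): singular value decomposition of centered simulation results exposes a single dominating deformation mode when one exists.

```python
import numpy as np

rng = np.random.default_rng(1)

# Rows: simulation runs; columns: nodal displacements of the firewall region.
mode = np.sin(np.linspace(0, np.pi, 300))      # one underlying deformation mode
amplitudes = rng.normal(1.0, 0.2, size=40)     # run-to-run scatter of that mode
X = np.outer(amplitudes, mode) + 0.01 * rng.normal(size=(40, 300))

Xc = X - X.mean(axis=0)                        # center before decomposition (PCA)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Fraction of the total variation explained by each component.
explained = s**2 / np.sum(s**2)
```

If `explained[0]` dominates, the variation is essentially one deformation mode, and part thicknesses can then be correlated against its amplitude (`U[:, 0] * s[0]`) run by run.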
-
Automated Generation of Robustness Knowledge for selected Crash Structures
Constantin Diez, Christian Wieser, Lothar Harzheim (Opel AG), Axel Schumacher (University of Wuppertal)
-
Process to improve optimization with combined robustness analysis results
Dominik Borsotto, Lennart Jansen, Clemens-August Thole (SIDACT GmbH)
2015
-
Classification-based Optimization and Reliability Assessment Using LS-OPT
Anirban Basudhar, Imtiaz Gandikota, Nielen Stander (LSTC), Åke Svedin, Christopher Belestam (DYNAmore Nordic, Sweden), Katharina Witowski (DYNAmore GmbH, Germany)
Simulation-based design optimization and probabilistic analysis involve repeated, potentially expensive function evaluations of design alternatives (e.g. crashworthiness analyses). To avoid the high cost (or computational time) associated with repeated expensive evaluations, the actual function evaluations are substituted with evaluations based on metamodels. Metamodels, trained on relatively few actual function evaluations, provide an approximation of the system responses and thus act as surrogate models. This paper presents an alternative method, based on the classification of response values, which does not require response approximation [7, 9]. As a result, it is unaffected by the presence of binary or discontinuous responses and provides a straightforward way to solve such problems. The principal idea is to classify the training samples into two classes, using a threshold value or a clustering method, and then construct an "optimal" decision boundary in the design space that separates the samples belonging to the different classes. This decision boundary can be used as the limit-state in reliability analysis [7] or to constrain feasible design alternatives in optimization [10].
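The core idea (classify samples by a response threshold, then use the resulting decision boundary instead of a response approximation) can be sketched as follows. This is a rough stand-in, not the paper's algorithm: the response function is a cheap invented analytical surrogate, and a k-nearest-neighbour vote replaces the optimized decision boundary used in LS-OPT.

```python
import numpy as np

rng = np.random.default_rng(2)

# Training samples in a 2-D design space.
X = rng.uniform(-1, 1, size=(200, 2))

# Stand-in for an expensive simulation response; a threshold splits the
# samples into two classes (feasible: +1, infeasible: -1).
response = X[:, 0] ** 2 + X[:, 1] ** 2
labels = np.where(response < 0.5, 1, -1)

def classify(x, X, labels, k=5):
    """k-nearest-neighbour vote acting as the decision boundary between classes."""
    d = np.linalg.norm(X - x, axis=1)
    nearest = labels[np.argsort(d)[:k]]
    return 1 if nearest.sum() > 0 else -1

# Candidate designs are now gated by the classifier alone; no approximation
# of the response value itself is ever needed.
pred_inside = classify(np.array([0.1, 0.1]), X, labels)
pred_outside = classify(np.array([0.9, 0.9]), X, labels)
```

Because only class membership matters, a binary or discontinuous response poses no difficulty: the jump simply becomes the boundary between the two classes.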
2014
-
Design Tolerance Optimization using LS-OPT
Anirban Basudhar, Nielen Stander, Imtiaz Gandikota (LSTC), Åke Svedin (DYNAmore Nordic, Sweden), Katharina Witowski (DYNAmore GmbH, Germany)
In this work, LS-OPT is used to perform tolerance-based design optimization using a multi-level scheme [6]. The outer level consists of an optimization problem that maximizes the tolerance value such that there is no failure within the tolerance interval (i.e. negligible probability of failure). The nominal design parameters and their tolerance values are the optimization variables for this level; thus, each sample in the outer level uniquely defines the nominal values and bounds of the variable probability density functions (PDFs). The probability of failure for each sample is determined using an inner-level Monte Carlo analysis. This value is then extracted as an outer-level response and is constrained to be close to zero during the optimization. The outer-level optimization is formulated such that a balance between robustness and performance is achieved.
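The inner-level Monte Carlo step can be sketched in a few lines. This is a minimal sketch under invented assumptions: a uniform PDF over the tolerance band and a hypothetical one-variable limit state `x**2 > 1` stand in for the real PDFs and the simulated failure criterion; each outer-level sample would call this with its own (nominal, tolerance) pair.

```python
import numpy as np

rng = np.random.default_rng(3)

def probability_of_failure(nominal, tol, n=20000):
    """Inner-level Monte Carlo: sample the variable inside its tolerance band
    (uniform PDF assumed here) and count violations of a stand-in limit state."""
    x = rng.uniform(nominal - tol, nominal + tol, size=n)
    failed = x**2 > 1.0          # hypothetical failure criterion
    return failed.mean()

# Two outer-level samples: each (nominal, tolerance) pair fixes the PDF bounds.
p_tight = probability_of_failure(nominal=0.0, tol=0.5)
p_wide = probability_of_failure(nominal=0.0, tol=1.5)
```

The outer level then pushes the tolerance as wide as possible while constraining the returned probability to stay near zero, which is what trades robustness against performance.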
-
Use of Data Reduction Methods for Robust Optimization
Dominik Borsotto, Lennart Jansen, Robin Strickstrock, Clemens-August Thole (SIDACT GmbH)
Data reduction methods like principal component analysis, singular value decomposition and independent component analysis allow the analysis of huge sets of data. Applied to simulation results, they allow the characterization of the major trends in the variation of these results. For the public Chevrolet Silverado example, all thicknesses are initially varied independently of each other across a number of simulation runs, and their correlations to the variation of the firewall behavior are computed. The behavior of the firewall is approximated using data reduction methods. It turns out that the variation of the firewall can be characterized by one basic deformation mode. The thickness variation of a part may show a strong or weak correlation to the behavior of this deformation mode. In several steps, the sensitivity analysis is then repeated using only those parts for thickness variation which had a strong correlation in the previous step. Finally, it turns out that the thicknesses of the longitudinal rails, as well as a certain bifurcation behavior of the longitudinals, are responsible for this variation mode of the firewall.
2011
-
Complexity Based Design Robustness Analysis - Application to Mechatronic Component (Vehicle Hatchback)
K. Kayvantash, D. Bordet (CADLM)
The purpose of this presentation is to simplify stochastic-analysis post-processing while presenting a solution that allows for fast interpretation of results via a unified system health indicator which we shall call "model complexity". Based on this model (or any other system) "robustness metric", we shall present a few examples clearly demonstrating how CAD/CAE-based design can be drastically improved via minimization of the model complexity. The method and associated tools may also be used for data mining, optimization, on-board dynamic warning and health monitoring, as well as numerous other applications requiring holistic ratings in virtual testing.
2010 & earlier
-
STOCHASTIC ANALYSIS OF UNCERTAINTIES FOR METAL FORMING PROCESSES WITH LS-OPT
Heiner Müllerschön (DYNAmore GmbH), Willem Roux (Livermore Software Technology Corporation), David Lorenz (DYNAmore GmbH), Karl Roll (Daimler AG)
The purpose of this paper is to account for uncertainties in the manufacturing processes of metal forming in order to evaluate the random variations with the aid of FE-simulations. Various parameters of the Finite-Element model describing the investigated structural model are affected by randomness. This, of course, leads...
-
New Developments in LS-OPT - Robustness Studies
Ken Craig (Multidisciplinary Design Optimization Group University of Pretoria), Willem Roux, Nielen Stander (Livermore Software Technology Corp)
Optimization of shell buckling incorporating Karhunen-Loève-based geometrical imperfections
-
BioRID-II Dummy Model Development -- Stochastic Investigations
S. Stahlschmidt, B. Keding, K. Witowski, H. Müllerschön, U. Franz (DYNAmore GmbH)
Whiplash injuries frequently occur in low-speed rear crashes. Many consumer and insurance organizations use the BioRID-II dummy as a test device to assess the risk of whiplash injuries in car accidents. An LS-DYNA model of the BioRID-II dummy has been developed by DYNAmore GmbH in cooperation with the German automotive industry....
-
Robust Design in LS-OPT 3
W. Roux, N. Stander (LSTC)
-
Robustness Investigation of a Numerical Simulation of the ECE-R14 with particular regard to correlation analysis
K. Hessenberger, R. Henniger (DaimlerChrysler), H. Müllerschön (DYNAmore)
The stability of the seats, seat-belts and seat-belt anchorages of a vehicle is very important for the passengers' safety during an accident. Therefore, dedicated safety tests have to be passed in order to ensure the correct functionality of those parts. These tests are defined in detail in the European regulation ECE-R14, where crash loads are substituted by appropriate pulling forces on the seat-belts. The complete configuration is designed, simulated and tested with certain (physical) parameters (geometry, materials, testing conditions, ...) which are assumed to be constant. In reality, most of these quantities may vary and take other values with certain probabilities. In general, different values for the design parameters will cause a variation in the simulation or testing result, which will also occur with a certain probability. Thus, the results of arbitrary separate investigations are commonly not meaningful enough, because their probability of occurrence is unknown. The determination of this probability is part of a robustness investigation. In this paper, Monte Carlo analysis, as a very simple method, is applied to an FEA model of the ECE-R14 test. Additionally, the identification of each parameter's importance for the simulation or testing result is a main topic. This is done by means of a linear correlation analysis, which is described in detail in section 3.
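The linear correlation analysis can be sketched as follows. This is a minimal sketch with invented inputs (the parameter names, distributions, and the stand-in result function are assumptions, not the paper's model): each scattered input is correlated against the scattered result to rank its importance.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500

# Scattered input parameters (hypothetical examples of varied quantities).
thickness = rng.normal(1.5, 0.05, n)       # sheet thickness [mm]
friction = rng.normal(0.10, 0.01, n)       # belt-guide friction coefficient
yield_stress = rng.normal(350.0, 10.0, n)  # yield stress [MPa]

# Stand-in result: anchorage displacement dominated by thickness, plus noise.
displacement = -8.0 * thickness + 0.2 * rng.normal(size=n)

def pearson(a, b):
    """Linear (Pearson) correlation coefficient between input and result scatter."""
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

r_thickness = pearson(thickness, displacement)
r_friction = pearson(friction, displacement)
```

A coefficient near ±1 flags a parameter whose scatter drives the result; one near 0 flags a parameter whose variation can safely be ignored in the robustness investigation.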
-
Robust Design for Crash at DaimlerChrysler Commercial Vehicles CAE
F. Günther (DaimlerChrysler AG), H. Müllerschön (DYNAmore GmbH), W. Roux (LSTC)
-
Two-Stage Stochastic and Deterministic Optimization
T. Rzesnitzek, F. Günther, F.Wozniak (DaimlerChrysler AG), H. Müllerschön (DYNAmore)
-
Probabilistic Analysis with LS-OPT
W. Roux, N. Stander (LSTC)