By Fatma Kocer-Poyraz

When we think of optimization, we naturally think of finding the design with the minimum cost, the maximum performance, and so on. However, optimization methods can serve objectives other than minimizing or maximizing. One such use case is meeting a set of target values: we neither minimize a response nor maximize it, but instead make sure it achieves the target value.

A typical application is model calibration, such as calibrating a simulation material model to match test data. We can formulate these cases as minimizing the difference between sets of data, which leads us down the path of using optimization methods for efficient and effective solution finding. How we calculate the difference may depend on the application, but a very common calculation that works well for most problems is minimizing the sum of normalized-difference-squared. Another common calculation is the area between two curves.
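As a concrete illustration of the two difference measures mentioned above, here is a minimal sketch in Python. The function names and the sample curve values are made up for illustration; only the formulas (sum of normalized-difference-squared, and the area between two curves via the trapezoidal rule) come from the text.

```python
def sum_normalized_sq_diff(test, sim):
    """Sum of normalized-difference-squared: sum(((s - t) / t)**2).
    Assumes both curves are sampled at the same points and targets are nonzero."""
    return sum(((s - t) / t) ** 2 for t, s in zip(test, sim))

def area_between_curves(x, test, sim):
    """Approximate area between two curves using the trapezoidal rule."""
    gaps = [abs(t - s) for t, s in zip(test, sim)]
    return sum((gaps[i] + gaps[i + 1]) / 2 * (x[i + 1] - x[i])
               for i in range(len(x) - 1))

# Illustrative stress values (MPa) at four shared strain points:
strain = [0.02, 0.04, 0.06, 0.08]
test   = [141.0, 144.0, 147.0, 148.0]
sim    = [130.0, 150.0, 141.0, 146.0]
```

Both measures go to zero when the curves coincide; the squared form penalizes large local deviations more strongly, while the area measure weights all deviations by the strain interval they span.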

The first aspect of solving such problems easily is method selection; the second is ease of setup; and the third is post-processing. In HyperStudy, we have an objective function formulation called System Identification. System identification allows users to set target values for a number of responses, automatically creates the difference equation, and uses it as the objective function. On the post-processing side, HyperStudy lists the values of all objectives, their targets, the deltas between them, and the normalized deltas. We will now go through an application that uses system identification for material model calibration.

**Description of the model:**

In this application we will calibrate a ductile aluminum alloy, modeled in RADIOSS Block as an elasto-plastic material using a Johnson-Cook model, to match test data. Our objective is to simulate the experimental tensile test, for which we have stress-strain results, and to find the simulation material model parameter values such that the stress-strain curve from the simulation matches the curve from the experiment.

We start by creating the simulation model for the tensile test. For this purpose, a quarter of a standard tensile test specimen is modeled using symmetry conditions. Traction is applied to the specimen via an imposed velocity at the left end.

The units are: mm, ms, g, N, MPa.

The material to be characterized is a 6063 T7 aluminum. It has an isotropic elasto-plastic behavior that can be reproduced by a Johnson-Cook model without damage (RADIOSS Block Law2), defined as follows:
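The law's equation does not survive in this text. As an assumption consistent with the parameters used later in the study (a, b, n, and a maximum stress), a quasi-static sketch of the Johnson-Cook flow stress without damage or strain-rate terms is shown below; note that RADIOSS Block Law2 also supports strain-rate effects not modeled here.

```python
def johnson_cook_stress(eps_p, a, b, n, sigma_max):
    """Quasi-static Johnson-Cook flow stress without damage (assumed
    simplified form): sigma = a + b * eps_p**n, capped at sigma_max.
    eps_p is the plastic strain; a, b, sigma_max in MPa, n dimensionless."""
    return min(a + b * eps_p ** n, sigma_max)

# Initial-guess parameter values from this study (MPa, except n):
a, b, n, sigma_max = 110.0, 120.0, 0.15, 280.0
stress = johnson_cook_stress(0.05, a, b, n, sigma_max)
```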

**Model Calibration Process:**

In this study we define as design variables the parameters a, b, n, σ_{max} (maximum stress), and the Young's Modulus. For the simulation results, engineering strains will be obtained by dividing the displacement of node 1 by the reference length (75 mm), and engineering stresses will be obtained by dividing the force in section 1 by its initial cross-section (12 mm^{2}).
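The response extraction described above reduces to two one-line conversions. A minimal sketch, using the reference length and cross-section stated in the text (the function names are ours):

```python
REF_LENGTH_MM = 75.0   # reference length for the displacement of node 1
SECTION_MM2   = 12.0   # initial cross-section of section 1

def engineering_strain(displacement_mm):
    """Engineering strain from the node 1 displacement."""
    return displacement_mm / REF_LENGTH_MM

def engineering_stress(force_n):
    """Engineering stress from the section 1 force; N / mm^2 = MPa."""
    return force_n / SECTION_MM2
```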

A RADIOSS simulation with initial guesses for the Young's Modulus, yield stress (a), hardening modulus (b), hardening exponent (n), and maximum stress of 60400 MPa, 110 MPa, 120 MPa, 0.15, and 280 MPa, respectively, leads to significant differences between the test and simulation results, as seen in the plot below.

The objective is to find the values for the five material properties so that these differences are minimized and, ideally, eliminated. We can achieve this by minimizing, if not eliminating, the differences between the simulation and the experimental values of:

- stress when strain = 0.02 (experiment value and hence our target is 141 MPa)
- stress at necking point (experiment value and hence our target is 148 MPa)
- strain at necking point (experiment value and hence our target is 0.08)

We will use a special optimization problem formulation called “System Identification” to solve this problem. System identification minimizes the sum of normalized error-squared, where error is the difference between the simulation results and the target values:

minimize Σ_{i} ((f_{i}(x) − T_{i}) / T_{i})^{2}

where f_{i}(x) is the i^{th} response obtained from analysis, and T_{i} is the target value for the i^{th} response.
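The system identification objective can be written in a few lines. The sketch below evaluates it for this study's three targets; the response values shown are illustrative placeholders, not actual solver output.

```python
def system_identification_objective(responses, targets):
    """Sum of normalized error-squared over all target responses."""
    return sum(((f - t) / t) ** 2 for f, t in zip(responses, targets))

targets   = [141.0, 148.0, 0.08]   # stress@0.02, stress@neck, strain@neck
responses = [140.0, 149.0, 0.08]   # illustrative values from one run
obj = system_identification_objective(responses, targets)
```

The normalization by each target keeps responses with very different magnitudes (stresses in the hundreds of MPa versus a strain of 0.08) on a comparable footing in the sum.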

Note that in HyperStudy we do not need to enter this equation manually; we can simply enter the target values for each response and use the “System Identification” objective type.

When doing design studies like this application, we recommend running a design of experiments (DoE) first. A DoE helps us understand the relations between responses and design variables. Looking at these relations and judging whether they make sense allows us to validate our study setup. Furthermore, it allows us to screen out design variables that do not have a significant effect on the responses before further, more costly studies such as optimization.

In this case, we used a full factorial DoE, as the simulations were not long and we have five design variables, leading to 32 runs for a two-level full factorial. For cases where the simulations are longer and/or there are more design variables, fractional factorial or Plackett-Burman designs can be used for a quicker DoE run.
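Generating a two-level full factorial over five variables is a Cartesian product of the bounds, 2^5 = 32 combinations. A minimal sketch; the bound values below are assumptions centered loosely on the initial guesses, since the actual study bounds are not given in the text:

```python
from itertools import product

# Assumed two-level bounds per design variable (MPa except n):
bounds = {
    "E":         (50000.0, 70000.0),  # Young's modulus (assumed range)
    "a":         (90.0, 130.0),       # yield stress (assumed range)
    "b":         (100.0, 140.0),      # hardening modulus (assumed range)
    "n":         (0.10, 0.20),        # hardening exponent (assumed range)
    "sigma_max": (250.0, 310.0),      # maximum stress (assumed range)
}

# One dict per run, covering every low/high combination:
runs = [dict(zip(bounds, levels)) for levels in product(*bounds.values())]
print(len(runs))  # 2**5 = 32 runs
```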

From the DoE, we look at effects charts to investigate our design study.

The first thing we need to decide based on the effects charts is whether they make sense. These plots show that the first design variable, Young's Modulus, does not have a significant effect on the responses. This makes sense, since the Young's Modulus controls the elastic part of the stress-strain curve and our responses lie beyond that part. Similarly, the effect of the last design variable, maximum stress, on the responses is almost null. After these investigations, we can conclude that we can safely omit these two insignificant design variables from further studies.
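A main effect in a two-level DoE is simply the average response at a variable's high level minus its average at the low level; near-zero effects flag candidates for screening out. A minimal sketch on a made-up two-variable, four-run example (the run and response values are illustrative, not from the study):

```python
def main_effect(runs, responses, var):
    """Mean response at the variable's high level minus its low level."""
    lo = min(r[var] for r in runs)
    highs = [y for r, y in zip(runs, responses) if r[var] != lo]
    lows  = [y for r, y in zip(runs, responses) if r[var] == lo]
    return sum(highs) / len(highs) - sum(lows) / len(lows)

# Illustrative 2x2 factorial over two variables with made-up responses:
runs = [{"a": 90.0, "b": 100.0}, {"a": 130.0, "b": 100.0},
        {"a": 90.0, "b": 140.0}, {"a": 130.0, "b": 140.0}]
responses = [120.0, 150.0, 128.0, 158.0]   # e.g. stress at strain 0.02, MPa

effect_a = main_effect(runs, responses, "a")
effect_b = main_effect(runs, responses, "b")
```

Here variable "a" moves the response far more than "b", which is the kind of contrast the effects charts make visible at a glance.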

Next we set up a system identification study. This starts by adding an optimization approach. We inactivate the two insignificant design variables, then add three objective functions and set their types to system identification. We set the targets to the experimental values of 141, 148, and 0.08 and run ARSM for the optimization. This run converged to responses of 140, 146, and 0.06 in five analyses. We would have liked a closer match to the experimental values. Looking at the optimum design variable values, we see that they converged to either their lower or upper bounds. If we relax these bounds, we can get a better correlation. We also change the initial design of this second optimization to the optimum design from the first run. This second run converged to 140, 149, and 0.08 in ten analyses. Below you can see that the experimental curve and the optimum design curve match very well.
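The loop structure of such a calibration run can be sketched in a few lines. ARSM itself builds adaptive response surfaces; the naive bounded random search below is only a stand-in to show the shape of the problem, and `toy_responses` is a made-up analytic surrogate, not the real RADIOSS run:

```python
import random

TARGETS = [141.0, 148.0, 0.08]  # stress@0.02, stress@neck, strain@neck

def toy_responses(a, b, n):
    """Made-up stand-in for one solver run (purely illustrative)."""
    return [a + b * 0.02 ** n,      # stress at strain 0.02
            a + b * 0.08 ** n,      # stress at necking
            0.05 + 0.2 * n]         # strain at necking

def objective(x):
    """System identification: sum of normalized error-squared."""
    return sum(((f - t) / t) ** 2 for f, t in zip(toy_responses(*x), TARGETS))

random.seed(0)
bounds = [(90.0, 130.0), (100.0, 140.0), (0.10, 0.20)]  # a, b, n (assumed)
best_x = [110.0, 120.0, 0.15]      # start from the initial guesses
best_f = objective(best_x)
for _ in range(500):
    cand = [random.uniform(lo, hi) for lo, hi in bounds]
    f = objective(cand)
    if f < best_f:
        best_x, best_f = cand, f
```

The bound-relaxation step in the text maps directly onto widening the `bounds` tuples when the optimum pins itself to an edge.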

In this article, we proposed an effective process for calibrating material models using design of experiments (DoE) and optimization methods in HyperStudy. We applied this process to calibrate a ductile aluminum alloy, modeled in RADIOSS Block as an elasto-plastic material using a Johnson-Cook model, to match test data. Our objective was to simulate the experimental tensile test, for which we have stress-strain results, and to find the simulation material model parameter values such that the stress-strain curve from the simulation matches the curve from the experiment.

Using design study methods such as DoE and optimization, instead of the traditional trial-and-error approach to model calibration, we can reduce calibration time and increase calibration quality. HyperStudy has special optimization formulations, such as system identification, to speed up the design study setup process. Furthermore, you can easily register other functions in HyperStudy if you'd like to do model calibration using your own math. Note that HyperStudy is solver-independent and can also work with applications running other solvers, such as LS-DYNA, Abaqus, ANSYS, Adams, etc.

If you want to learn more about how to set up this kind of study, please watch the e-learning video clip in the Feature Learning session, showing a sneak peek of the soon-to-be-released HyperStudy 12.0.