Trends in Structural Analysis: FEA, Composites, HPC, and Beyond

HyperWorks provides a broad portfolio of multi-physics solvers covering mechanical, electromagnetic, fluid/thermal, and model-based development systems, all built around optimization technology. The following Q&A transcript summarizes the thoughts of Uwe Schramm, Chief Technical Officer at Altair, on different aspects to consider in simulation packages.

  1. What are the important characteristics of a structural FEA code?
  • A structural FEA code should be developed around optimization, cover multi-physics, and be built on a scalable architecture that takes advantage of HPC systems. Engineers should be able to run many loading scenarios simultaneously, even introducing multi-physics loading sequences such as temperature or fluid-structure interaction (FSI). A wide variety of problems require solution sequences for static, steady-state, and transient analyses. It is important to have a large library of material and failure models, preferably ones that can be combined freely to capture complex physics. The software should be supported by a licensing scheme that allows massive design exploration, multi-run analysis, optimization, and reliability-based design (a minimal sketch of such a multi-run reliability study follows below).
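
The multi-run, reliability-based workflow described above can be illustrated with a minimal, solver-agnostic sketch in Python. The `run_case` function is a hypothetical stand-in for a single FEA solve (a real workflow would submit each load case to the structural solver); the loop samples a scattered service load and estimates the probability of exceeding an assumed allowable stress.

```python
import random

def run_case(load_kN):
    """Hypothetical stand-in for one FEA solve: returns peak stress in MPa.
    In practice this would submit a load case to the structural solver."""
    compliance = 3.2  # assumed stress per unit load (MPa/kN) for this illustrative component
    return compliance * load_kN

def estimate_failure_probability(n_runs=10_000, allowable_mpa=450.0):
    """Monte Carlo reliability sketch: sample an uncertain service load,
    evaluate the (surrogate) analysis for each sample, and count exceedances."""
    failures = 0
    for _ in range(n_runs):
        load = random.gauss(mu=120.0, sigma=15.0)  # assumed load scatter, kN
        if run_case(load) > allowable_mpa:
            failures += 1
    return failures / n_runs

if __name__ == "__main__":
    print(f"Estimated probability of failure: {estimate_failure_probability():.4f}")
```
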
  2. Composite materials are everywhere. What are some of the unique challenges of dealing with composite materials from a simulation perspective?
  • Composite materials are playing an increasing role in industry segments such as aerospace, automotive, defense and space, marine, and consumer products because of desirable characteristics including light weight, corrosion resistance, high strength, and design flexibility. One of the key challenges in modeling composite materials is the development of reliable material models for simulation. Unlike metals, composite materials require a large number of tests to accurately characterize their behavior. Engineers can overcome this challenge with virtual models, varying the composition of fiber and matrix and testing virtually to arrive at allowables. Beyond getting the right material data, modeling composite materials in an actual macro-sized component can be a computational bottleneck, which is addressed with a multiscale approach. Manufacturing processes also influence material behavior, because the fibers become oriented in different ways and this affects performance. Porosity and residual stresses after heat treatment also play a significant role in the final behavior, so simulating the manufacturing process before evaluating performance is necessary. Because composites are built layer by layer, the stacking sequence, thicknesses, and ply orientations need to be determined intelligently based on service requirements (the laminate-stiffness sketch below the figure illustrates how the layup drives stiffness). Finally, predicting the failure modes of composite materials is a difficult area that should be addressed with techniques such as the extended finite element method (XFEM) and the multiscale approach.

Woven composite unit cell modeled in Multiscale Designer
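
To make the stacking-sequence point concrete, the sketch below uses classical lamination theory to assemble the extensional stiffness (A) matrix of a laminate from assumed unidirectional ply properties. It is deliberately the simplest laminate-level calculation, not Multiscale Designer's multiscale workflow, and the material values are illustrative carbon/epoxy numbers, but it shows how ply angles and the layup determine the stiffness an engineer must tune against service requirements.

```python
import numpy as np

# Assumed unidirectional ply properties (illustrative carbon/epoxy values, GPa)
E1, E2, G12, NU12 = 135.0, 10.0, 5.0, 0.30
PLY_T = 0.125e-3  # ply thickness, m

def qbar(theta_deg):
    """In-plane reduced stiffness of one ply rotated by theta (classical lamination theory)."""
    nu21 = NU12 * E2 / E1
    d = 1.0 - NU12 * nu21
    q11, q22, q12, q66 = E1 / d, E2 / d, NU12 * E2 / d, G12
    m, n = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    return np.array([
        [q11*m**4 + 2*(q12 + 2*q66)*m**2*n**2 + q22*n**4,
         (q11 + q22 - 4*q66)*m**2*n**2 + q12*(m**4 + n**4),
         (q11 - q12 - 2*q66)*m**3*n + (q12 - q22 + 2*q66)*m*n**3],
        [(q11 + q22 - 4*q66)*m**2*n**2 + q12*(m**4 + n**4),
         q11*n**4 + 2*(q12 + 2*q66)*m**2*n**2 + q22*m**4,
         (q11 - q12 - 2*q66)*m*n**3 + (q12 - q22 + 2*q66)*m**3*n],
        [(q11 - q12 - 2*q66)*m**3*n + (q12 - q22 + 2*q66)*m*n**3,
         (q11 - q12 - 2*q66)*m*n**3 + (q12 - q22 + 2*q66)*m**3*n,
         (q11 + q22 - 2*q12 - 2*q66)*m**2*n**2 + q66*(m**4 + n**4)]
    ])

def extensional_stiffness(layup_deg):
    """A-matrix (GPa*m) of a laminate: sum of each ply's Q-bar times its thickness."""
    return sum(qbar(t) * PLY_T for t in layup_deg)

if __name__ == "__main__":
    A = extensional_stiffness([0, 45, -45, 90, 90, -45, 45, 0])  # symmetric quasi-isotropic layup
    print(np.round(A, 4))
```
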

  3. How does topology optimization evolve in a simulation-driven design philosophy?
  • Topology optimization helps simulation-driven design: doing CAE work upfront results in better architectures and provides a good starting point. This design paradigm will be adopted by more and more customers. Topology optimization will remain at the core of simulation-driven design in the future; it won't go away, and it will cover more physics, more manufacturing constraints, and other constraints such as fail-safe requirements for building redundant structures that mitigate sudden failures (a minimal density-update sketch follows the figure below).

Failsafe topology optimization in OptiStruct
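
OptiStruct's topology and fail-safe optimization is far richer than this, but the core of density-based (SIMP-style) topology optimization can be sketched in a few lines: given element compliance sensitivities from an FEA run, an optimality-criteria update concentrates material in the hardest-working elements while honoring a volume budget. The sensitivities below are synthetic placeholders standing in for solver output.

```python
import numpy as np

def oc_update(x, dc, vol_frac, move=0.2, p_damp=0.5):
    """One optimality-criteria update of element densities x (SIMP-style).
    dc: compliance sensitivities (negative); vol_frac: target volume fraction."""
    lo, hi = 1e-9, 1e9
    while (hi - lo) / (hi + lo) > 1e-4:              # bisection on the volume-constraint multiplier
        lmid = 0.5 * (lo + hi)
        x_new = x * (-dc / lmid) ** p_damp           # scale densities by damped sensitivity ratio
        x_new = np.clip(x_new, x - move, x + move)   # move limit
        x_new = np.clip(x_new, 1e-3, 1.0)            # physical bounds
        if x_new.mean() > vol_frac:
            lo = lmid
        else:
            hi = lmid
    return x_new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.full(1000, 0.5)            # uniform initial design
    dc = -rng.random(1000)            # synthetic sensitivities (stand-in for FEA output)
    x = oc_update(x, dc, vol_frac=0.5)
    print(f"volume fraction after update: {x.mean():.3f}")
```
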

  4. With model sizes increasing every day, how do cloud computing and HPC environments come into play?
  • Cloud is another compute resource. It offers accessibility to compute resources, more flexibility, and better data security. Standalone and cloud resources will blend and mix in the future, and there will be virtually no distinction between the two. Cloud is a logical development of the interconnection between computing resources.
  • Model sizes keep increasing at a rapid rate, but a model is never too large for HPC; if anything, it can only be too small. Certain methods and algorithms, such as the Lattice Boltzmann method (LBM) or Smoothed Particle Hydrodynamics (SPH) for CFD applications, are especially suited for parallelization. These algorithms are embarrassingly parallel: you can put one degree of freedom (DOF) on a core and scale up easily with larger models. This may not be possible in applications where intimate model information at the smallest discretization needs to be evaluated, as in an impact event with explicit solvers; however, algorithms can be devised to break models into very small pieces that load the compute cores evenly. Other types of problems, such as multi-body dynamics, involve only tens of degrees of freedom and may not need to be parallelized because the computations are quick. There is no upper limit on HPC effectiveness; an algorithm may have a limit, but that can be overcome technologically by reprogramming and rewriting it (Amdahl's law quantifies such limits; see the sketch below the figure).

RADIOSS crash simulation in HPC and Cloud environments
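
The scaling argument above, embarrassingly parallel particle methods versus algorithms with an inherently serial portion, can be made concrete with Amdahl's law, speedup = 1 / ((1 - p) + p/N), where p is the parallelizable fraction of the work and N the number of cores. The short sketch below is a generic illustration of that limit, not a RADIOSS benchmark.

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: ideal speedup on `cores` when only `parallel_fraction` of the work scales."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

if __name__ == "__main__":
    for p in (0.99, 0.999, 1.0):      # 1.0 approximates an embarrassingly parallel particle method
        speedups = [amdahl_speedup(p, n) for n in (64, 1024, 16384)]
        print(f"p = {p:5.3f}: " + ", ".join(f"{s:8.1f}x" for s in speedups))
```
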

  5. Democratization of simulation tools makes them accessible to a large audience. What considerations need to be evaluated so that the tools can be used by designers who may not have deep technical or engineering backgrounds?
  • Yes, these applications can be used by designers, with some qualifications. It all goes back to how the virtual model is built. Democratization is a good concept because it gives many people access to evaluation, but the virtual model should be a close representation of the physical prototype so that realistic behavior is observed. Human interaction with the virtual prototype should be the same as with the physical one. The closer the virtual prototype is to the physical prototype, the better it is at predicting performance and reducing errors. We are still bound by the limits of computation, but looking toward future technology, the virtual world will replace the physical world: product creators will test and develop new products with virtual prototyping. Certain enablers make democratization prevalent: automation, an enhanced and intuitive user experience, portable technology, and faster computation for near-real-time evaluation. Many tasks that do not require deep technical knowledge have already been automated. In cases where interpretation comes into play, optimization algorithms can help reduce the dependence on user actions and judgment.
  6. What are a couple of trends that simulation packages need to address to meet future product development needs?
  • In the age of IoT, there is a need to connect to other devices: not just computers that execute simulations, but also physical data collected with sensors. That is the new element in the context of simulation with data analytics. Not just design decisions but also maintenance decisions will be based not on physical inspection but on data obtained in service or operation. A simulation model combined with physical data offers feedback for future design directions as well as current maintenance (a minimal calibration sketch follows the figure below).

Altair’s Digital Twin Platform
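
At its simplest, the feedback loop described above can be sketched as calibrating an uncertain model parameter against sensor data from the fielded product and comparing it with the design-stage value to trigger maintenance or redesign. The example below is a generic least-squares calibration with hypothetical load and deflection readings; it is not an Altair Digital Twin API.

```python
import numpy as np

def calibrate_compliance(loads, measured_deflections):
    """Least-squares fit of an effective compliance c in deflection = c * load,
    i.e. updating a simple physics model from in-service sensor data."""
    loads = np.asarray(loads, dtype=float)
    meas = np.asarray(measured_deflections, dtype=float)
    return float(loads @ meas / (loads @ loads))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    nominal_c = 2.0e-3                        # design-stage compliance, mm per N (assumed)
    loads = rng.uniform(100.0, 500.0, 200)    # hypothetical sensed loads, N
    true_c = 2.4e-3                           # fielded unit is 20% more compliant (e.g. wear)
    deflections = true_c * loads + rng.normal(0.0, 0.02, loads.size)  # noisy deflection readings, mm
    fitted_c = calibrate_compliance(loads, deflections)
    drift = (fitted_c - nominal_c) / nominal_c
    print(f"fitted compliance: {fitted_c:.2e} mm/N, drift vs. design: {drift:+.1%}")
```
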

  • User experience matters, both in the software's built-in settings and in how external tools interface with software such as solvers. A poorly designed user experience can limit the functionality of a capable solver. The trend toward portable devices should also weigh in when considering ubiquitous access to the technology.


About Sridhar Ravikoti

Sridhar Ravikoti is the Technical Director of Global Partner Programs at Altair. He has been with Altair since 2000, gaining experience in engineering product development and software program management. In his current role as the technical lead for the Altair Partner Alliance, Sridhar drives a synergistic relationship between Altair's offerings and its Partners. He holds a Bachelor's degree from Osmania University in India and a Master's degree from the University of Nebraska-Lincoln, with a major in Mechanical Engineering and a minor in Applied Mechanics.