Dag Brück
Technology architecture director, working on Dymola development in a broad sense since 1992. Also active in the SSP design group and in ProSTEP ivip Smart System Engineering. Former member of ISO WG21 (C++).
Sessions
You will discover the latest news and perspectives regarding Dymola, its underlying technologies, and the portfolio of libraries. Also included: the latest standards support and examples of workflows involving native and web clients of the Dassault Systèmes offering.
We present two technologies for speeding up co-simulations under the FMI standards. Smoothing the input signals inside each FMU allows the internal integrator to avoid re-initialization, which can significantly reduce the number of model and Jacobian evaluations. To further help the integrator, we also propose a predictor compensation technique tailored to the input smoother. The main benefit of our technologies is their ease of use, requiring neither model manipulation nor any special co-simulation master algorithm. The technologies are implemented in Dymola 2025x and validated with both an academic mechanical model and thermo-fluid examples, where we observe performance gains of a factor of up to 100, and often around 5-10. One of these thermo-fluid examples is used in the OpenSCALING research project to generate training data for constructing surrogate models, for which the input smoothing is especially important to speed up dataset creation.
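As a rough illustration of the input-smoothing idea, the following minimal Python sketch (not the Dymola implementation; the exact smoother and the predictor compensation are not reproduced, and the function names and blend window are assumptions) replaces the zero-order-hold step an FMU input would take at a communication point with a C1-continuous blend, so the FMU's internal integrator sees a smooth signal instead of a discontinuity:

# Hedged sketch, not the Dymola implementation: it only illustrates replacing
# the step that an FMU input normally takes at each communication point by a
# C1-continuous blend, so the internal integrator need not re-initialize.

def smoothstep(s: float) -> float:
    """C1 Hermite blend on [0, 1]: 3*s^2 - 2*s^3, clamped outside [0, 1]."""
    s = min(max(s, 0.0), 1.0)
    return s * s * (3.0 - 2.0 * s)

def smoothed_input(t: float, t_comm: float, h_macro: float,
                   u_prev: float, u_new: float,
                   blend_fraction: float = 0.2) -> float:
    """Input value seen by the FMU at time t in [t_comm, t_comm + h_macro].

    Instead of jumping from u_prev to u_new at t_comm (zero-order hold),
    the new value is approached smoothly over the first blend_fraction of
    the macro step. blend_fraction is an illustrative tuning parameter.
    """
    s = (t - t_comm) / (blend_fraction * h_macro)
    return u_prev + (u_new - u_prev) * smoothstep(s)

if __name__ == "__main__":
    # Compare the raw step with the smoothed signal over one macro step.
    t0, H, u_prev, u_new = 1.0, 0.1, 0.0, 1.0
    for i in range(11):
        t = t0 + i * H / 10
        print(f"t = {t:.3f}  step = {u_new:.2f}  smoothed = "
              f"{smoothed_input(t, t0, H, u_prev, u_new):.3f}")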
Extended abstract for a User Presentation at the SSP User Meeting. This application to a heat pump system evaluates the potential of the SSP standard to describe system structures that drive an end-to-end credible simulation process, from the definition of an abstract analysis architecture to the evaluation of the overall system behavior in a co-simulation setup. From a practitioner's perspective, the benefits and shortcomings are compared against current best practices using proprietary solutions.
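As an illustration of what describing a system structure with SSP means in practice, here is a small Python sketch (assuming the SSP 1.0 SystemStructureDescription schema; the element and attribute names should be checked against the published standard, and the file name and helper are illustrative, not part of the presentation) that lists the components and connections of an extracted .ssd file, i.e. the information a co-simulation master needs to assemble such a system:

# Hedged sketch assuming the SSP 1.0 SystemStructureDescription (.ssd) schema;
# verify element/attribute names against the published standard.
import xml.etree.ElementTree as ET

SSD_NS = {"ssd": "http://ssp-standard.org/SSP1/SystemStructureDescription"}

def summarize_ssd(path: str) -> None:
    """Print the components (typically FMUs) and connections of an .ssd file."""
    root = ET.parse(path).getroot()
    system = root.find("ssd:System", SSD_NS)
    print("System:", system.get("name"))
    for comp in system.findall("./ssd:Elements/ssd:Component", SSD_NS):
        print("  component:", comp.get("name"), "->", comp.get("source"))
    for conn in system.findall("./ssd:Connections/ssd:Connection", SSD_NS):
        print("  connection: {}.{} -> {}.{}".format(
            conn.get("startElement"), conn.get("startConnector"),
            conn.get("endElement"), conn.get("endConnector")))

# Example call (hypothetical file name):
# summarize_ssd("HeatPumpSystem.ssd")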
Virtualization is becoming increasingly important in the development of ever more complex products, and the use of modeling and simulation activities as part of product development and release is increasing accordingly. For these activities, traceability is a keystone for quality tracking and for efficient reuse. This presentation shows how this can be realized by applying the SSP Traceability layered standard (i.e., which information is involved in a modeling and simulation activity, where it comes from, and where it shall be propagated) in conjunction with the MIC Core standard for metadata. We begin by presenting the Credible Simulation Process Framework, developed as part of the prostep SmartSE project, which enables integration into company processes.

We illustrate this using the challenges of developing a simulation model. For various reasons, such development typically consists only of the design and the implementation in a simulation tool (e.g., Dymola); the model is then immediately used for simulation purposes, skipping essential activities such as the requirement specification (e.g., relevant application area, relevant physical effects, …), the documentation for the end user, and the model verification and validation. Often, the user documentation is written afterwards from the model implementation, leading, for example, to transcription issues, inconsistent content or versions between implementation and documentation, and high review effort to ensure the quality of the developed model. We then present the solution approach based on SSP 2.0 and the SSP Traceability layered standard and demonstrate it.

Another important point for traceability and reuse is the use of standardized metadata to find and evaluate information; here we present a solution based on the MIC Core metadata standard. We close with an overview of which of these solution elements are implemented and publicly available.
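To make the kind of information involved more tangible, here is a purely schematic Python sketch; the SSP Traceability layered standard and MIC Core define their own XML-based formats, which are not reproduced here, and all class and field names below are illustrative. It only shows a traceability record linking an activity to its provenance and to where its results shall be propagated:

# Purely illustrative sketch: not the SSP Traceability or MIC Core formats.
# It shows the kind of information the abstract mentions: what an activity
# produced, where it came from, and where it shall be propagated.
from dataclasses import dataclass, field

@dataclass
class MetadataRecord:           # hypothetical stand-in for MIC Core metadata
    title: str
    version: str
    author: str
    keywords: list[str] = field(default_factory=list)

@dataclass
class TraceabilityEntry:        # hypothetical stand-in for a traceability item
    activity: str               # e.g. "requirement specification", "validation"
    artifact: str               # e.g. path or URI of the produced document/model
    derived_from: list[str]     # provenance: where the information comes from
    propagate_to: list[str]     # where the information shall be propagated
    metadata: MetadataRecord

entry = TraceabilityEntry(
    activity="requirement specification",
    artifact="requirements/simulation_model_requirements.md",
    derived_from=["stakeholder needs"],
    propagate_to=["model implementation", "user documentation", "validation"],
    metadata=MetadataRecord(title="Simulation model requirements",
                            version="0.1", author="N.N."),
)
print(entry.activity, "->", ", ".join(entry.propagate_to))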