Introduction to Post Computations#

This mode of operation is used to apply post calculations to results obtained with the finite element method or with simulations. Because the data input for post-processing is read from the calculation results files, the file names produced by the post-processor are constructed from the original names by appending p to the file name extensions. For example, the treatment of integration points in the problem problem.integ will produce a file named problem.integp. A file problem.utp is automatically generated in order to allow graphical visualization of the new variables.

All of the results (calculation and post-calculation) are simultaneously available in the Zmaster graphical program and in the Zmaster batch program.

[Figure: pp-us.fig.svg]

Two basic types of operations are defined:

  • A so-called local post treatment, which is used to evaluate a criterion at nodal or Gauss point locations throughout the mesh, or in a part of it. The calculation is based only on the data contained at the point(s) of concern.

  • Post calculations which are global, in that they are used to evaluate a criterion over the whole structure, or in a sub-part of it, at a given instant. The calculation therefore normally accounts for data distributed in space.
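The two modes can be combined in a single input file. The following sketch is assembled from commands used later in this chapter; the material file name creep.inp and the variable sig are illustrative:

****post_processing
 ***local_post_processing        % local: evaluated point by point
  **file integ
  **elset ALL_ELEMENT
  **material_file creep.inp
  **process creep
   *var sig
 ***global_post_processing       % global: one value per output
  **output_number 1
  **process average
   *list_var sig
****return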

[Figure: cube-us.fig.svg]

Local post-processing#

Local post computation is used, for example, to predict the component life of a structure under simple monotonic loading (e.g. Rice-Tracey models), under creep, or under fatigue (low- or high-cycle models). It can also be used to determine derived data relative to the local history at each point, such as the maximum stress or temperature reached during the loading history.

According to the user's choice, the computation is made at integration (Gauss) points (from the file problem.integ) or at nodal points, from data originally saved at the nodes (files problem.node and problem.ctnod).

The local post treatments also give the user the ability to make connections between Z-set computations and other codes. This can be achieved, for example, by using the format command to create formatted ASCII text output, or by using an external program or script to calculate a particular user post computation. Note also that the post-processor is fully extensible using the Z-set plug-ins.

Global post-processing#

Global post-processing is applied when, for example, one wishes to calculate the mean value of a variable over the entire structure. Most of the time, the structure’s geometry must be known, and an integration volume is defined. Another example is a Weibull criterion for brittle fracture prediction.

The majority of global post computations produce output which is reduced to a scalar value for each time step of the calculation. These values are stored, with some description, in an ASCII file problem.post. Nevertheless, some of the global post-processors produce local results (while still using a non-local algorithm), which are stored in the appropriate file (nodal or integration values).
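For instance, a structure-wide average can be requested with the **process average command shown later in this chapter (a sketch; the variable name sig11 is illustrative):

 ***global_post_processing
  **output_number 1-999
  **process average
   *list_var sig11

One scalar value is then written to problem.post for each selected output number.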

General rules#

For the two types of post-calculation, the user defines the context of the operation: the groups of elements, nodes, or Gauss points on which to calculate. The output numbers may also be taken into account. The context can be redefined during the calculation; it is important to remember that a given computation or criterion is applied to the last context which was defined. The individual computations are applied one after another, in their order of appearance in the input file. Local and global post computations may be alternated arbitrarily.
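Schematically, the effect of redefining the context can be sketched as follows (the elset names BLADE and ROOT are placeholders):

 ***local_post_processing
  **file integ
  **material_file creep.inp
  **elset BLADE
  **process creep            % applied on elset BLADE
   *var sig
  **elset ROOT               % context redefined
  **process creep            % applied on elset ROOT
   *var sig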

The variables for which the post computation is applied can be:

  • FEA output: their names appear in the file problem.ut

  • Post-computation output calculated in the course of the same execution. That is, once a post computation is applied, the new variables it generates are added to the list of available variables.

Recursive computations are possible, using the newly generated variables in subsequent post computations. An example would be to calculate an equivalent strain as a combination of the strain components at each point, and then calculate the maximum of these equivalent strains. The global results (stored in the file problem.post) are, however, not re-usable.

The data source#

In version 8.3 and greater, Z-post has the capability to run off different data source formats than the default Zebulon solver files. Notably, there is the possibility to import from ABAQUS, ANSYS, Z-sim, and arbitrary ASCII files. With this, mesher commands can be applied to the results files as necessary to make new selections available. The commands for the import are described in the section on ***data_source. An example showing typical use follows.

****post_processing
 ***data_source rst                    % load the ansys results
  **open t-base_model_fillet.rst       % and write a native
  **write_geof                         % mesh file (GEOF format)
  **elset LOOK                         % with some set generation
   *elements
     2205 2163 311 325 339 353 381 367 395 409 423 437 479 521
  **nset edge1
   *nodes
    2396 2397 1 2394 2395 2393 2392 2390 2391 2389 2388 2386 2387
    2385 2384 2382 2383 2381 2380 2379 2373
  **bset edge1
   *use_nset edge1
   *use_dimension 2
 ***precision 3
  %----------------------------------------
 ***local_post_processing              % here we copy the displacements
  **output_number 1-999              % so the post generated views
  **file node                        % can be deformed
  **output_to_node
  **nset ALL_NODE
  **process copy *list_var U1 U2 U3
 ***local_post_processing              %
  **output_number 1-999                % Ansys results are contours
  **file ctele   **elset ALL_ELEMENT   % by element (CTELE)
  **material_file  post1.inp           % where to find coefs.. (here we
                                        % use the same physical file)
  **process transform_frame
   *tensor_variables sig
   *output_variables sigm              % "sigma-material"
   *use_element_rotations
 ***return

The data output#

Z-post has the capability to write to a number of different output formats, and the output can be restricted to selected subsets of the results.

In every case, the standard “-p” files will also be written, because these files are used to access intermediate variables when several post computations rely on each other. An example of this post-processing with an ANSYS input and an ABAQUS output follows:

****post_processing
 ***data_source rst                    % load ansys rst format
  **open ansys_model.rst              % in the file ansys_model.rst

 ***data_output odb
   *problem_name abaqus_post_results   % makes abaqus_post_results.odb file
   *elset outer_surface                % make odb of this elset only
 ***local_post_processing
    ...

Please note that the data output also includes a format for Zebulon files. This output format has the following additional features, which may be preferable in some cases:

  • the data is written to disk in a single large write, without seek calls which can cause network latency.

  • the output can be restricted to a list of elsets, from which a reduced mesh is generated. This output can therefore be used to reduce storage size.

  • it can be used to write a “zebulon” calculation format (non-p suffixed files), and can therefore form the basis of a subsequent series of post calculations.

Interface with Zmaster#

The post calculations have been embedded in Zmaster, in addition to their use generating additional output files. That is, post computations can be run directly on the selected data in Zmaster. After adding a post computation in the Zmaster environment, the .mast file needs to be re-saved.

Material coefficients#

Many of the post computation models use coefficients to specify parameters which vary for different materials. There are two ways of specifying these coefficients, each of which may be more convenient in different situations.

The first method is to have the material data separated from the process definition. This allows re-use of the coefficients for multiple computations by creating a repository of material values. An example input with two creep entries follows.

**material_file creep.inp
**process creep
 *var sig
**process creep
 *var sig
 *express_life_as time

Both creep process computations would then use the creep entry in a material file creep.inp:

***post_processing_data
 **process creep
   r 5.
   A 1500.
   S0 0.
***return

Another method is to simply put the coefficients in-line with the process command. This is obviously easier for “one-off” post processing:

**process creep
 *var sig
 *model_coef
   r     5.0
   A  1500.0
   S0    0.0

Pointers and stacked post computations#

Like all of the Z-set packages, the post-processing software is built on objects which perform specific tasks. In the post-processing case, however, there are many situations where a calculation, such as a thermo-mechanical fatigue analysis employing plastic range, oxidation, and creep damages, needs to re-use other computations which can themselves act as stand-alone computations. That is to say, very often a post computation will require pointers to other post input sections defining those additional procedures. The following is a detailed example of such input (from the test lcf.inp in Post_test/INP).

The analysis is defined by the “primary” section, which will be the first instance of ****post_processing in the input file (or \(N\)-th instance in the case of an -N input switch).

****post_processing
 ***local_post_processing
  **file integ
  **elset ALL_ELEMENT

  **material_file ../MAT/test_simple
  **process LCF             % LCF auto creates a stress range making NC_S NF_S
   *mode NLC_ONERA         % creates NR_NLC_ONERA
   *fatigue fatigue_S 2    % this means use fatigue_S post
   *creep   creep     2    % and creep post in post .inp file #2

  **material_file ../MAT/test_with_a
  **process LCF             % there is already NC_S NF_S, new are NC_S_n1 NF_S_n1
   *mode NLC_ONERA         % creates NR_NLC_ONERA_n1
   *fatigue fatigue_S 3    % look in 3rd ****post_processing segment
   *creep   creep     3

  **material_file ../MAT/test_simple_norm
  **process LCF             % now NC_S_n2 NF_S_n2
   *mode NLC_ONERA         % NR_NLC_ONERA_n2
   *fatigue fatigue_S 4
   *creep   creep     4

  **material_file ../MAT/test_with_a_norm
  **process LCF             % and finally NC_S_n3 NF_S_n3
   *mode NLC_ONERA         % NR_NLC_ONERA_n3
   *fatigue fatigue_S 5
   *creep   creep     5

 ***global_post_processing    % average values are taken for the validation
  **output_number 1         % purposes. One can look at a field in Zmaster
  **process average
   *list_var NC_S NF_S NF_S_n1 NF_S_n2 NF_S_n3
  **process average
   *list_var NR_NLC_ONERA NR_NLC_ONERA_n1 NR_NLC_ONERA_n2 NR_NLC_ONERA_n3
****return

Note that the active material file is switched throughout the input structure, and each calculation uses pointers for the *fatigue and *creep parts. That is, the LCF model is a method of combining fatigue and creep damages (with, for example, the nonlinear combination method of ONERA), while the way those damages are calculated is defined elsewhere. In this sense we follow the “object-oriented” approach found throughout Z-set: the damages are treated as abstractions, which become concrete through the additional objects instantiated in the different pointed-to data files.

For this example, the following sections are included in the same input file, after the above “primary” section.

%section 2
test_simple
****post_processing
    **process fatigue_S
      *var sig
      *mode simple
    **process creep
      *var sig
****return

%section 3
test_with_a
****post_processing
    **process fatigue_S
      *var sig
      *mode with_a
    **process creep
      *var sig
****return

%section 4
test_simple_norm
****post_processing
    **process fatigue_S
      *var sig
      *mode simple
      *normalized_coeff
    **process creep
      *var sig
****return

%section 5
test_with_a_norm
****post_processing
    **process fatigue_S
      *mode with_a
      *var sig
      *normalized_coeff
    **process creep
      *var sig
****return

For the material files, there will be a succession of **process sections whose types correspond to the types of calculation in the input file. For example, the file ../MAT/test_simple referenced above contains:

%
% fatigue coefficients for the formula without a, and
% non normed coefficients
%
***post_processing_data
 **process fatigue_S      % coefficients relating to the
   M          14782.065   % fatigue_S defined in %section 2
   beta       2.5
   sigma_l    40.5
   sigma_u   170.
   b1        0.0003
   b2        0.0003
 **process creep          % coefficients relating to
   A 452.                 % creep defined in %section 2
   S0 0.
   r 10.
   k 30.
 **process LCF            % coefficients controlling the LCF
   a   0.1                % from section 1, mode NLC_ONERA
   k   30.
   beta  2.5
***return

For the other LCF processes, similar coefficient files with the same structure are used. These different files are read in turn because the sequence of **material_file options swaps the effective material file name as the main input is read.