result: migrate to h5dpf
Version: 0.0.0
Description
Read mesh properties from the result files contained in the streams or data sources and make those properties available through a mesh selection manager in output. The user can provide a GenericDataContainer that maps a workflow to a result name. Example map: `{default: wf1, EUL: wf2, ENG_SE: wf3}`.
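The per-result workflow map above can be pictured as a plain dictionary with a `default` fallback. This is an illustrative sketch only: the names (`wf1`, `wf2`, `select_workflow`) are hypothetical stand-ins, not part of the operator's API.

```python
# Illustrative stand-in for the GenericDataContainer map described above:
# each result name selects a workflow, and "default" is the fallback entry.
compression_map = {
    "default": "wf1",   # fallback workflow
    "EUL": "wf2",       # workflow applied to the EUL result
    "ENG_SE": "wf3",    # workflow applied to the ENG_SE result
}

def select_workflow(result_name, mapping):
    """Return the workflow mapped to result_name, falling back to 'default'."""
    return mapping.get(result_name, mapping["default"])
```

For example, `select_workflow("EUL", compression_map)` yields `wf2`, while any unmapped result name falls back to `wf1`.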
Inputs
| Input | Name | Expected type(s) | Description |
|---|---|---|---|
| Pin -7 | h5_chunk_size | int32, generic_data_container | Size of each HDF5 chunk in kilobytes (KB). Default: 1 MB when compression is enabled; for uncompressed datasets, the default is the full dataset size × its dimension. |
| Pin -5 | dataset_size_compression_threshold | int32, generic_data_container | Integer value that defines the minimum dataset size (in bytes) above which HDF5 native compression is applied. Applicable to arrays of floats, doubles, and integers. |
| Pin -2 | h5_native_compression | int32, abstract_data_tree, generic_data_container | Integer value or DataTree that defines the HDF5 native compression to use. Integer input: {0: no compression (default); 1-9: GZIP compression, where 9 provides maximum compression at the slowest speed}. DataTree input: {type: None / GZIP / ZSTD; level: GZIP (1-9) / ZSTD (1-20); num_threads: ZSTD (>0)}. |
| Pin -1 | export_floats | bool, generic_data_container | Converts doubles to floats to reduce file size (default is true). If false, nodal results are exported in double precision and elemental results in single precision. |
| Pin 0 Required | filename | string | Filename of the migrated file. |
| Pin 1 | comma_separated_list_of_results | string | List of results (source operator names), separated by semicolons, to store (example: `U;S;EPEL`). If empty, all available results are converted. |
| Pin 2 | all_time_sets | bool | Deprecated. Please use filtering workflows instead to select time scoping. Default is false. |
| Pin 3 | streams_container | streams_container | streams (result file container) (optional) |
| Pin 4 | data_sources | data_sources | If the streams_container is not set, the result file path is taken from the data sources. |
| Pin 6 | compression_workflow | workflow, generic_data_container | BETA Option: Applies input compression workflow. |
| Pin 7 | filtering_workflow | workflow, generic_data_container | Applies input filtering workflow. |
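The DataTree form of pin -2 accepts a small set of fields with bounded values (type, level, num_threads). A minimal validation sketch of those constraints, assuming a plain-dict representation; the function name and dict layout are illustrative, not the operator's API:

```python
# Illustrative sketch: checking the compression settings accepted on pin -2
# (type: None / GZIP / ZSTD; level: GZIP 1-9, ZSTD 1-20; num_threads > 0
# for ZSTD). The dict layout and function name are assumptions.
LEVEL_RANGES = {"GZIP": range(1, 10), "ZSTD": range(1, 21)}

def validate_compression(options):
    """Raise ValueError if options violate the documented constraints."""
    ctype = options.get("type")
    if ctype in (None, "None"):
        return True  # no compression requested
    if ctype not in LEVEL_RANGES:
        raise ValueError(f"unknown compression type: {ctype}")
    if options.get("level") not in LEVEL_RANGES[ctype]:
        raise ValueError(f"{ctype} level out of range")
    if ctype == "ZSTD" and options.get("num_threads", 1) < 1:
        raise ValueError("num_threads must be > 0 for ZSTD")
    return True
```

For instance, `{"type": "GZIP", "level": 9}` passes, while `{"type": "GZIP", "level": 12}` is rejected because GZIP levels stop at 9.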
Outputs
| Output | Name | Expected type(s) | Description |
|---|---|---|---|
| Pin 0 | migrated_file | data_sources | Data sources referencing the migrated file. |
Configurations
| Name | Expected type(s) | Default value | Description |
|---|---|---|---|
| mutex | bool | false | If this option is set to true, the shared memory is prevented from being simultaneously accessed by multiple threads. |
Scripting
Category: result
Plugin: core
Scripting name: migrate_to_h5dpf
Full name: result.migrate_to_h5dpf
Internal name: hdf5::h5dpf::migrate_file
License: None
Changelog
- Version 0.0.0: Initial release.