integrator.processconfig module#
- integrator.processconfig.get_partial_output_file(output_file, scan_file, partial_output_files_subfolder=None)#
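The naming scheme used by this helper is not documented here. As an illustration only, a minimal sketch with the same signature might derive the partial file name from the scan file name and place it under the optional subfolder (this convention is an assumption, not the library's actual behavior):

```python
import os

def get_partial_output_file(output_file, scan_file, partial_output_files_subfolder=None):
    # Hypothetical re-implementation for illustration: the real
    # integrator.processconfig.get_partial_output_file may use a
    # different naming convention.
    out_dir = os.path.dirname(output_file)
    if partial_output_files_subfolder is not None:
        out_dir = os.path.join(out_dir, partial_output_files_subfolder)
    scan_name = os.path.splitext(os.path.basename(scan_file))[0]
    return os.path.join(out_dir, scan_name + "_partial.h5")
```

For example, `get_partial_output_file("/data/out/result.h5", "/data/scans/scan_0001.h5", "partials")` would yield a path ending in `partials/scan_0001_partial.h5` under this assumed scheme.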
- class integrator.processconfig.ProcessingConfig(ai_config, computations_config, scans_list, output_files_list, conf_dict=None, partial_output_files_subfolder=None, workspace_directory=None)#
Bases: object
- A class that holds information on:
  - input datasets
  - final output files
  - partial output files
  - Azimuthal Integration parameters
  - Computation distribution parameters
- build_tasks_distribution()#
Build a list of tasks to assign to each worker.
- There are two levels of task subdivision:
  - Each “multi-processes worker” (MP) handles one or several scans (thousands of images). This MP spawns several workers (W).
  - Each worker (W) handles one or several acquisition files (hundreds of images per file). It saves its results to partial files (to avoid parallel writes).
- Returns:
workers_tasks (list of lists of tuple) – List of n_workers items, where each item is a list of (input_url, output_fname) tuples, i.e. workers_tasks[worker_id] is a list of tuples of the form (input_url, output_fname).
scans_info (dict) – Dictionary holding information on all the scans. For each scan, scans_info[scan_url_txt] has several fields:
“scan_structure”: the associated HDF5Dataset object
“partial_scan_urls”: list of DataUrl, the files constituting the scan
“partial_output_files”: list of str, the partial output files
“output_file”: str, the final output file
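The workers_tasks structure described above can be sketched with a simple round-robin distribution. This is an illustrative re-implementation only (the real build_tasks_distribution() may balance tasks differently), and the input URLs and partial file names below are hypothetical:

```python
def distribute_tasks(input_urls_per_scan, partial_output_files_per_scan, n_workers):
    """Illustrative round-robin distribution: each worker receives a list
    of (input_url, output_fname) tuples, and each partial output file is
    assigned to exactly one worker, so no two workers write to the same file."""
    workers_tasks = [[] for _ in range(n_workers)]
    worker_id = 0
    for urls, outputs in zip(input_urls_per_scan, partial_output_files_per_scan):
        for input_url, output_fname in zip(urls, outputs):
            workers_tasks[worker_id].append((input_url, output_fname))
            worker_id = (worker_id + 1) % n_workers
    return workers_tasks

# Hypothetical scan with three acquisition files, distributed over two workers
urls = [["silx:///data/scan_0001.h5?path=/1.1",
         "silx:///data/scan_0001.h5?path=/2.1",
         "silx:///data/scan_0001.h5?path=/3.1"]]
outs = [["partial_0.h5", "partial_1.h5", "partial_2.h5"]]
tasks = distribute_tasks(urls, outs, n_workers=2)
```

Here tasks[0] holds the first and third acquisition files and tasks[1] the second, so each partial file has a single writer.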
- dump_tasks_to_disk(tasks)#
- integrator.processconfig.load_work_config(fname)#
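The on-disk format shared by dump_tasks_to_disk() and load_work_config() is not specified here. As a sketch only, assuming a JSON representation (both the format and the dump_work_config name below are assumptions), a dump/load round trip could look like:

```python
import json
import os
import tempfile

def dump_work_config(config_dict, fname):
    # Hypothetical serialization: the actual format used by
    # dump_tasks_to_disk() / load_work_config() is not documented here.
    with open(fname, "w") as f:
        json.dump(config_dict, f, indent=2)

def load_work_config(fname):
    # Load a work configuration previously written to disk
    with open(fname) as f:
        return json.load(f)

# Round-trip a minimal, hypothetical work configuration
config = {"scans_list": ["/data/scan_0001.h5"], "n_workers": 4}
fname = os.path.join(tempfile.mkdtemp(), "work_config.json")
dump_work_config(config, fname)
```

Under this assumed format, load_work_config(fname) returns the same dictionary that was dumped.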