pyFAI.io package¶
pyFAI.io.image module¶
Module function to read images.
pyFAI.io.image.read_data(image_path)¶
Returns a numpy.array image from a file name or a URL.
Parameters: image_path (str) – Path of the image file
Return type: numpy.ndarray, regardless of the dimension or the content
Raises:
- IOError – if the data is not reachable
- TypeError – if the data is not an image (wrong size, wrong dimension)
pyFAI.io.image.read_image_data(image_path)¶
Returns a numpy.array image from a file name or a URL.
Parameters: image_path (str) – Path of the image file
Return type: numpy.ndarray
Raises:
- IOError – if the data is not reachable
- TypeError – if the data is not an image (wrong size, wrong dimension)
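A minimal usage sketch for read_image_data; the file path is illustrative, and the two documented exceptions are handled explicitly:

    import numpy
    from pyFAI.io.image import read_image_data

    # "frame_0000.edf" is an illustrative path, not a file shipped with pyFAI.
    try:
        frame = read_image_data("frame_0000.edf")
    except IOError:
        print("data not reachable")
    except TypeError:
        print("not an image (wrong size or dimension)")
    else:
        assert isinstance(frame, numpy.ndarray)
        print(frame.shape, frame.dtype)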
pyFAI.io.integration_config module¶
Module function to manage poni files.
class pyFAI.io.integration_config.ConfigurationReader(config)¶
Bases: object
__init__(config)¶
Parameters: config – dictionary
pop_detector()¶
Returns the detector stored in the json configuration.
Return type: pyFAI.detectors.Detector
pop_method(default=None)¶
Returns a Method from the method field of the json dictionary.
Return type: pyFAI.method_registry.Method
pop_ponifile()¶
Returns the geometry subpart of the configuration.
pyFAI.io.integration_config.normalize(config, inplace=False)¶
Normalize the configuration to the one supported internally (the latest one).
Parameters:
- config (dict) – The configuration dictionary to read
- inplace (bool) – If true, the dictionary is edited in place
Raises: ValueError – If the configuration does not match.
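A short sketch of how these two pieces fit together, assuming a worker configuration stored in a hypothetical azimint.json file and that normalize returns the updated dictionary when inplace=False:

    import json
    from pyFAI.io.integration_config import ConfigurationReader, normalize

    # "azimint.json" is a hypothetical worker configuration file,
    # e.g. one written by the pyFAI integration GUI.
    with open("azimint.json") as f:
        config = json.load(f)

    # Bring an older configuration up to the layout supported internally;
    # the normalized dictionary is assumed to be returned when inplace=False.
    config = normalize(config)

    reader = ConfigurationReader(config)
    detector = reader.pop_detector()   # pyFAI.detectors.Detector
    method = reader.pop_method()       # pyFAI.method_registry.Method
    poni = reader.pop_ponifile()       # geometry subpart of the configuration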
pyFAI.io.nexus module¶
Module for writing HDF5 in the Nexus style
class pyFAI.io.nexus.Nexus(filename, mode=None, creator=None)¶
Bases: object
Writer class to handle Nexus/HDF5 data
Manages:
- entry
- pyFAI-subentry
- detector
TODO: make it thread-safe !!!
__init__(filename, mode=None, creator=None)¶
Constructor
Parameters:
- filename – name of the hdf5 file containing the nexus
- mode – can be ‘r’, ‘a’, ‘w’, ‘+’ …
- creator – set as attr of the NXroot
close(end_time=None)¶
Close the file and update all entries.
deep_copy(name, obj, where='/', toplevel=None, excluded=None, overwrite=False)¶
Perform a deep copy: create a “name” entry in self containing a copy of the object.
Parameters:
- where – path to the toplevel object (i.e. root)
- toplevel – directly the top level Group
- excluded – list of keys to be excluded
- overwrite – replace content if already existing
find_detector(all=False)¶
Tries to find a detector within a NeXus file; takes the first compatible detector.
Parameters: all – return all detectors found as a list
flush()¶
classmethod get_attr(dset, name, default=None)¶
Return the attribute of the dataset.
Handles the ascii -> unicode issue in python3 #275
Parameters:
- dset – a HDF5 dataset (or a group)
- name – name of the attribute
- default – default value to be returned
Returns: attribute value decoded in python3 or default
get_class(grp, class_type='NXcollection')¶
Return all sub-groups of the given type within a group.
Parameters:
- grp – HDF5 group
- class_type – name of the NeXus class
get_data(grp, class_type='NXdata')¶
Return all datasets of the NeXus class NXdata. WRONG, do not use…
Parameters:
- grp – HDF5 group
- class_type – name of the NeXus class
get_dataset(grp, attr=None, value=None)¶
Return the list of datasets of the group matching the given attribute with the given value.
Parameters:
- grp – HDF5 group
- attr – name of an attribute
- value – requested value for the attribute
Returns: list of datasets
get_default_NXdata()¶
Return the default plot configured in the nexus structure.
Returns: the group with the default plot or None if not found
get_entries()¶
Retrieves all entries, sorted latest first.
Returns: list of HDF5 groups
get_entry(name)¶
Retrieves an entry from its name
Parameters: name – name of the entry to retrieve
Returns: HDF5 group of NXclass == NXentry
new_class(grp, name, class_type='NXcollection')¶
Create a new sub-group with type class_type.
Parameters:
- grp – parent group
- name – name of the sub-group
- class_type – NeXus class name
Returns: subgroup created
new_detector(name='detector', entry='entry', subentry='pyFAI')¶
Create a new entry/pyFAI/Detector
Parameters:
- name – name of the detector
- entry – name of the entry
- subentry – all pyFAI description of detectors should be in a pyFAI sub-entry
new_entry(entry='entry', program_name='pyFAI', title=None, force_time=None, force_name=False)¶
Create a new entry
Parameters:
- entry – name of the entry
- program_name – value of the field as string
- title – description of experiment as str
- force_time – enforce the start_time (as string!)
- force_name – force the entry name as such, without numerical suffix.
Returns: the corresponding HDF5 group
new_instrument(entry='entry', instrument_name='id00')¶
Create an instrument in an entry, or create both the entry and the instrument if needed.
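A minimal sketch of writing with the Nexus class; the file name, entry title and dataset content are illustrative:

    import numpy
    from pyFAI.io.nexus import Nexus

    nxs = Nexus("result.h5", mode="a", creator="my_script")
    try:
        entry = nxs.new_entry(entry="entry", program_name="pyFAI",
                              title="azimuthal integration")
        # Group the results in a NeXus collection below the new entry.
        coll = nxs.new_class(entry, "integrated", class_type="NXcollection")
        coll["intensity"] = numpy.zeros(1000)
    finally:
        nxs.close()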
pyFAI.io.nexus.from_isotime(text, use_tz=False)¶
Parameters: text – string representing the time in ISO format
pyFAI.io.nexus.get_isotime(forceTime=None)¶
Parameters: forceTime (float) – enforce a given time (current time by default)
Returns: the current time as an ISO8601 string
Return type: string
pyFAI.io.nexus.is_hdf5(filename)¶
Check if a file is actually an HDF5 file
Parameters: filename – this file had better exist
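The helper functions can be used on their own, for instance to time-stamp entries and to probe files before opening them (the file name below is illustrative):

    from pyFAI.io.nexus import get_isotime, is_hdf5

    # Current time as an ISO8601 string, e.g. for an NXentry start_time field.
    timestamp = get_isotime()

    # "result.h5" is an illustrative path; the file should exist on disk.
    if is_hdf5("result.h5"):
        print("HDF5 file, checked at", timestamp)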
pyFAI.io.ponifile module¶
Module function to manage poni files.
class pyFAI.io.ponifile.PoniFile(data=None)¶
Bases: object
__init__(data=None)¶
Initialize self. See help(type(self)) for accurate signature.
as_dict()¶
detector¶
Return type: Union[None, pyFAI.detectors.Detector]
dist¶
Return type: Union[None,float]
poni1¶
Return type: Union[None,float]
poni2¶
Return type: Union[None,float]
read_from_dict(config)¶
Initialize this object using a dictionary.
Note: the dictionary is versioned.
read_from_duck(duck)¶
Initialize the object using an object providing the same API.
The duck object must provide dist, poni1, poni2, rot1, rot2, rot3, wavelength, and detector.
read_from_file(filename)¶
rot1¶
Return type: Union[None,float]
rot2¶
Return type: Union[None,float]
rot3¶
Return type: Union[None,float]
wavelength¶
Return type: Union[None,float]
write(fd)¶
Write this object to an open stream.
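A minimal sketch of reading and re-writing a PONI file; "LaB6.poni" is an illustrative calibration file name, and populating a default-constructed object through read_from_file is assumed to be valid:

    from pyFAI.io.ponifile import PoniFile

    poni = PoniFile()
    poni.read_from_file("LaB6.poni")         # illustrative calibration file
    print(poni.dist, poni.poni1, poni.poni2, poni.wavelength)

    config = poni.as_dict()                  # versioned dictionary representation
    with open("copy.poni", "w") as fd:
        poni.write(fd)                       # write back to an open stream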
pyFAI.io.sparse_frame module¶
Module for writing sparse frames in HDF5 in the Nexus style
pyFAI.io.sparse_frame.save_sparse(filename, frames, beamline='beamline', ai=None, source=None, extra={})¶
Write the list of frames into a HDF5 file
Parameters:
- filename – name of the file
- frames – list of sparse frames (as built by sparsify)
- beamline – name of the beamline as text
- ai – Instance of geometry or azimuthal integrator
- source – list of input files
- extra – dict with extra metadata
Returns: None
Module contents¶
Module for “high-performance” writing in either 1D with ASCII, or 2D with FabIO, or even nD with n varying from 2 to 4 using HDF5.
Stand-alone module which tries to offer an interface to HDF5 via h5py and capabilities to write EDF or other formats using fabio.
Can be imported without h5py, but is then limited to fabio & ascii formats.
TODO:
- Add monitor to HDF5
class pyFAI.io.AsciiWriter(filename=None, prefix='fai_', extension='.dat')¶
Bases: pyFAI.io.Writer
Ascii file writer (.xy or .dat)
__init__(filename=None, prefix='fai_', extension='.dat')¶
init(fai_cfg=None, lima_cfg=None)¶
Creates the directory that will host the output file(s)
write(data, index=0)¶
To be implemented
class pyFAI.io.DefaultAiWriter(filename, engine=None)¶
Bases: pyFAI.io.Writer
__init__(filename, engine=None)¶
Constructor of the historical writer of AzimuthalIntegrator.
Parameters:
- filename – name of the output file
- engine – the integrator; it should provide a make_headers method
close()¶
flush()¶
To be implemented
init(fai_cfg=None, lima_cfg=None)¶
Creates the directory that will host the output file(s)
Parameters:
- fai_cfg – configuration for worker
- lima_cfg – configuration for acquisition
make_headers(hdr='#', has_mask=None, has_dark=None, has_flat=None, polarization_factor=None, normalization_factor=None, metadata=None)¶
Parameters:
- hdr (str) – string used as comment in the header
- has_dark (bool) – save the dark filenames (default: no)
- has_flat (bool) – save the flat filenames (default: no)
- polarization_factor (float) – the polarization factor
Returns: the header
Return type: str
save1D(filename, dim1, I, error=None, dim1_unit='2th_deg', has_mask=None, has_dark=False, has_flat=False, polarization_factor=None, normalization_factor=None, metadata=None)¶
This method saves the result of a 1D integration as an ASCII file.
Parameters:
- filename (str) – the filename used to save the 1D integration
- dim1 (numpy.ndarray) – the x coordinates of the integrated curve
- I (numpy.ndarray) – the integrated intensity
- error (numpy.ndarray or None) – the error bar for each intensity
- dim1_unit (pyFAI.units.Unit) – the unit of the dim1 array
- has_mask – a mask was used
- has_dark – a dark-current was applied
- has_flat – flat-field was applied
- polarization_factor (float, None) – the polarization factor
- normalization_factor (float, None) – the monitor value
- metadata – JSON-serializable dictionary containing the metadata
save2D(filename, I, dim1, dim2, error=None, dim1_unit='2th_deg', has_mask=None, has_dark=False, has_flat=False, polarization_factor=None, normalization_factor=None, metadata=None, format_='edf')¶
This method saves the result of a 2D integration.
Parameters:
- filename (str) – the filename used to save the 2D histogram
- dim1 (numpy.ndarray) – the 1st coordinates of the histogram
- dim2 (numpy.ndarray) – the 2nd coordinates of the histogram
- I (numpy.ndarray) – the integrated intensity
- error (numpy.ndarray or None) – the error bar for each intensity
- dim1_unit (pyFAI.units.Unit) – the unit of the dim1 array
- has_mask – a mask was used
- has_dark – a dark-current was applied
- has_flat – flat-field was applied
- polarization_factor (float, None) – the polarization factor
- normalization_factor (float, None) – the monitor value
- metadata – JSON-serializable dictionary containing the metadata
- format_ – file format to be used (FabIO format)
set_filename(filename)¶
Define the filename which will be used.
write(data)¶
Minimalistic method to limit the overhead.
Parameters: data (Integrate1dResult or Integrate2dResult) – array with intensities or tuple (2th,I) or (I,2th,chi)
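A minimal sketch of DefaultAiWriter.save1D with synthetic data; the geometry is illustrative, and in practice the writer is usually driven by an AzimuthalIntegrator rather than called by hand:

    import numpy
    from pyFAI.azimuthalIntegrator import AzimuthalIntegrator
    from pyFAI.io import DefaultAiWriter

    # Illustrative geometry: 10 cm sample-detector distance, Pilatus 1M, 1 Å.
    ai = AzimuthalIntegrator(dist=0.1, detector="Pilatus1M", wavelength=1e-10)
    writer = DefaultAiWriter("pattern.dat", ai)

    tth = numpy.linspace(1, 60, 600)          # 2theta positions in degrees
    intensity = numpy.random.random(600)      # synthetic integrated intensities

    writer.save1D("pattern.dat", tth, intensity, dim1_unit="2th_deg")
    writer.close()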
class pyFAI.io.FabioWriter(filename=None)¶
Bases: pyFAI.io.Writer
Image file writer based on FabIO
TODO !!!
__init__(filename=None)¶
init(fai_cfg=None, lima_cfg=None, directory='pyFAI')¶
Creates the directory that will host the output file(s)
write(data, index=0)¶
To be implemented
class pyFAI.io.HDF5Writer(filename, hpath=None, entry_template=None, fast_scan_width=None, append_frames=False, mode='error')¶
Bases: pyFAI.io.Writer
Class allowing to write HDF5 files.
CONFIG = 'configuration'¶
DATASET_NAME = 'data'¶
MODE_APPEND = 'append'¶
MODE_DELETE = 'delete'¶
MODE_ERROR = 'error'¶
MODE_OVERWRITE = 'overwrite'¶
__init__(filename, hpath=None, entry_template=None, fast_scan_width=None, append_frames=False, mode='error')¶
Constructor of an HDF5 writer:
Parameters:
- filename (str) – name of the file
- hpath (str) – name of the entry group that will contain the NXprocess
- entry_template (str) – formattable template to create a new entry (if hpath is not specified)
- fast_scan_width (int) – set it to define the width of the fast scan
close()¶
flush(radial=None, azimuthal=None)¶
Update some data like axis units and so on.
Parameters:
- radial – position in radial direction
- azimuthal – position in azimuthal direction
init(fai_cfg=None, lima_cfg=None)¶
Initializes the HDF5 file for writing
Parameters: fai_cfg – the configuration of the worker as a dictionary
set_hdf5_input_dataset(dataset)¶
Record the input dataset with an external link
write(data, index=None)¶
Minimalistic method to limit the overhead.
Parameters: data – array with intensities or tuple (2th,I) or (I,2th,chi)
class pyFAI.io.Writer(filename=None, extension=None)¶
Bases: object
Abstract class for writers.
CONFIG_ITEMS = ['filename', 'dirname', 'extension', 'subdir', 'hpath']¶
__init__(filename=None, extension=None)¶
Constructor of the class
flush(*arg, **kwarg)¶
To be implemented
init(fai_cfg=None, lima_cfg=None)¶
Creates the directory that will host the output file(s)
Parameters:
- fai_cfg – configuration for worker
- lima_cfg – configuration for acquisition
setJsonConfig(json_config=None)¶
Sets the JSON configuration
write(data)¶
To be implemented
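Since Writer is abstract, concrete writers implement init, write and flush. A toy sketch of such a subclass (hypothetical, for illustration only; it assumes the base constructor stores the filename attribute):

    from pyFAI.io import Writer

    class PrintWriter(Writer):
        """Toy writer that only prints; hypothetical, for illustration."""

        def init(self, fai_cfg=None, lima_cfg=None):
            # A real writer would create its output directory here.
            self.fai_cfg = fai_cfg or {}
            self.lima_cfg = lima_cfg or {}

        def write(self, data):
            print("would write %d values to %s" % (len(data), self.filename))

        def flush(self, *arg, **kwarg):
            pass

    writer = PrintWriter("out.txt")
    writer.init()
    writer.write(list(range(10)))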