nabu.stitching.stitcher.single_axis module¶
- class nabu.stitching.stitcher.single_axis.SingleAxisStitcher(configuration, *args, **kwargs)[source]¶
Bases:
_StitcherBase
Base class for any single-axis stitcher.
- property axis: int¶
- property dumper¶
- property stitching_axis_in_frame_space¶
Stitching is operated in 2D (frame) space, so the axis in frame space differs from the one in the 3D ebs-tomo space (https://tomo.gitlab-pages.esrf.fr/bliss-tomo/master/modelization.html).
- stitch(store_composition: bool = True) BaseIdentifier [source]¶
Apply the stitch expected from the configuration and return the identifier of the created object.
- Parameters:
store_composition (bool) – if True, store the composition used for stitching in frame_composition, so it can be reused by third parties (like tomwer) to display the composition made
- property serie_label: str¶
Return the series name, used for logs.
- get_final_axis_positions_in_px() dict [source]¶
Compute the final position (in pixels) from the initial position of the first object and the final relative shift (1).
(1): the final relative shift is obtained from the initial shift (from motor positions or provided by the user) plus the refinement shift from the cross-correlation algorithm.
- Returns:
dict with tomo object identifier (str) as key and a tuple of positions in pixels (axis_0_pos, axis_1_pos, axis_2_pos) as value
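The position arithmetic above can be sketched with plain Python. This is an illustrative helper, not nabu's implementation: the function name, the dict shapes, and the default zero shift are assumptions; only the "final position = initial position + refined shift" rule and the (axis_0, axis_1, axis_2) tuple layout come from the docstring above.

```python
# Illustrative sketch (hypothetical helper, not nabu's implementation):
# combine each object's initial axis positions with the refined shift
# obtained from the cross-correlation step.
def final_axis_positions_in_px(initial_positions, refined_shifts):
    """Both arguments map a tomo object identifier (str) to an
    (axis_0, axis_1, axis_2) tuple of positions/shifts in pixels.
    Objects without a refined shift keep their initial position."""
    return {
        identifier: tuple(
            pos + shift
            for pos, shift in zip(
                initial_positions[identifier],
                refined_shifts.get(identifier, (0, 0, 0)),
            )
        )
        for identifier in initial_positions
    }

positions = final_axis_positions_in_px(
    {"scan_a": (0, 0, 0), "scan_b": (120, 0, 0)},
    {"scan_b": (-3, 1, 0)},
)
# positions["scan_b"] == (117, 1, 0)
```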
- settle_flips()[source]¶
Users can provide information on existing flips at the frame level. The goal of this step is to get one flip_lr and one flip_ud value per scan or volume.
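A minimal sketch of what "settling" per-frame flips into one value per scan could look like. This is a hypothetical helper (not nabu's settle_flips): the agreement check is an assumption made for illustration.

```python
# Hypothetical helper (not nabu's implementation): reduce per-frame
# flip_lr / flip_ud flags to a single value per scan, assuming all
# frames of the scan must agree on their flip state.
def settle_scan_flips(per_frame_flips_lr, per_frame_flips_ud):
    lr_values = set(per_frame_flips_lr)
    ud_values = set(per_frame_flips_ud)
    if len(lr_values) != 1 or len(ud_values) != 1:
        raise ValueError("inconsistent flip flags within a single scan")
    return lr_values.pop(), ud_values.pop()

flips = settle_scan_flips([True, True, True], [False, False, False])
# flips == (True, False)
```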
- property series: Series¶
- property configuration: SingleAxisStitchingConfiguration¶
- property progress¶
- normalize_frame_by_sample(frames: tuple)[source]¶
Normalize frames from a sample picked on the left or right side.
- static stitch_frames(frames: tuple | ndarray, axis, x_relative_shifts: tuple, y_relative_shifts: tuple, output_dtype: ndarray, stitching_axis: int, overlap_kernels: tuple, dumper: DumperBase = None, check_inputs=True, shift_mode='nearest', i_frame=None, return_composition_cls=False, alignment='center', pad_mode='constant', new_width: int | None = None) ndarray [source]¶
Shift frames according to the provided shifts (as (y, x) tuples), then stitch all the shifted frames together and save them to the output dataset.
- Parameters:
frames (tuple) – each element must be a DataUrl or a 2D numpy array
stitching_regions_hdf5_dataset –
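The shift-then-stitch idea behind stitch_frames can be illustrated with a toy numpy sketch. This is not nabu's stitch_frames (which handles DataUrls, overlap kernels, padding, and alignment); it only shows the core step of blending an overlap region between two frames along the stitching axis, with a plain average standing in for the overlap kernel.

```python
import numpy as np

# Toy sketch (not nabu's stitch_frames): stitch two 2D frames that
# overlap by `overlap` rows along axis 0, blending the overlap region
# with a simple average in place of a real overlap kernel.
def stitch_two_frames(frame_a, frame_b, overlap):
    assert frame_a.shape[1] == frame_b.shape[1], "frames must share a width"
    # Average the last `overlap` rows of frame_a with the first
    # `overlap` rows of frame_b.
    blended = (frame_a[-overlap:] + frame_b[:overlap]) / 2.0
    return np.concatenate(
        [frame_a[:-overlap], blended, frame_b[overlap:]], axis=0
    )

a = np.ones((4, 3))
b = np.full((4, 3), 3.0)
out = stitch_two_frames(a, b, overlap=2)
# out has shape (6, 3): 2 rows from a, 2 blended rows (value 2.0),
# 2 rows from b.
```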