nabu.stitching.utils.utils module

class nabu.stitching.utils.utils.ShiftAlgorithm(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: Enum

All generic shift search algorithms

NABU_FFT = 'nabu-fft'
SKIMAGE = 'skimage'
ITK_IMG_REG_V4 = 'itk-img-reg-v4'
NONE = 'None'
CENTERED = 'centered'
GLOBAL = 'global'
SLIDING_WINDOW = 'sliding-window'
GROWING_WINDOW = 'growing-window'
SINO_COARSE_TO_FINE = 'sino-coarse-to-fine'
COMPOSITE_COARSE_TO_FINE = 'composite-coarse-to-fine'
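
A minimal usage sketch: selecting an algorithm from its configuration string is a plain Enum value lookup (the string values below come from the members listed above):

    from nabu.stitching.utils.utils import ShiftAlgorithm

    # look up an algorithm from its configuration string (standard Enum value lookup)
    algorithm = ShiftAlgorithm("nabu-fft")
    assert algorithm is ShiftAlgorithm.NABU_FFT

    # 'None' disables the shift search
    no_search = ShiftAlgorithm("None")
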
nabu.stitching.utils.utils.find_frame_relative_shifts(overlap_upper_frame: ndarray, overlap_lower_frame: ndarray, estimated_shifts: tuple, overlap_axis: int, x_cross_correlation_function=None, y_cross_correlation_function=None, x_shifts_params: dict | None = None, y_shifts_params: dict | None = None)[source]
Parameters:

overlap_axis – axis in (0, 1) along which the overlap exists, in image space. So 0 corresponds to y and 1 to x
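
A minimal sketch, assuming two 2D frames covering the same overlap region and the default cross-correlation settings; the frame contents and the 'a priori' shifts are placeholders:

    import numpy
    from nabu.stitching.utils.utils import find_frame_relative_shifts

    # two hypothetical frames covering the same overlap region (axis 0 is y, axis 1 is x)
    overlap_upper_frame = numpy.random.random((40, 256))
    overlap_lower_frame = numpy.random.random((40, 256))

    found_shifts = find_frame_relative_shifts(
        overlap_upper_frame=overlap_upper_frame,
        overlap_lower_frame=overlap_lower_frame,
        estimated_shifts=(0, 0),  # placeholder 'a priori' shift estimation
        overlap_axis=0,           # overlap along y in image space
    )
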

nabu.stitching.utils.utils.find_volumes_relative_shifts(upper_volume: VolumeBase, lower_volume: VolumeBase, overlap_axis: int, estimated_shifts, dim_axis_1: int, dtype, flip_ud_upper_frame: bool = False, flip_ud_lower_frame: bool = False, slice_for_shift: int | str = 'middle', x_cross_correlation_function=None, y_cross_correlation_function=None, x_shifts_params: dict | None = None, y_shifts_params: dict | None = None, alignment_axis_2='center', alignment_axis_1='center')[source]
Parameters:

dim_axis_1 (int) – axis 1 dimension (to handle axis 1 alignment)
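
A minimal sketch, assuming two HDF5 volumes handled through tomoscan's HDF5Volume; the file paths, data paths and shift values are placeholders, and the choice of overlap_axis=0 (vertical stitching) is an assumption:

    import numpy
    from tomoscan.esrf.volume import HDF5Volume
    from nabu.stitching.utils.utils import find_volumes_relative_shifts

    upper_volume = HDF5Volume(file_path="upper_vol.hdf5", data_path="volume")  # placeholder paths
    lower_volume = HDF5Volume(file_path="lower_vol.hdf5", data_path="volume")

    found_shifts = find_volumes_relative_shifts(
        upper_volume=upper_volume,
        lower_volume=lower_volume,
        overlap_axis=0,
        estimated_shifts=(10, 0),  # placeholder 'a priori' shift estimation
        dim_axis_1=256,            # volume dimension along axis 1 (placeholder)
        dtype=numpy.float32,
        slice_for_shift="middle",  # use the middle slice to estimate the shift
    )
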

nabu.stitching.utils.utils.find_projections_relative_shifts(upper_scan: TomoScanBase, lower_scan: TomoScanBase, estimated_shifts: tuple, axis: int, flip_ud_upper_frame: bool = False, flip_ud_lower_frame: bool = False, projection_for_shift: int | str = 'middle', invert_order: bool = False, x_cross_correlation_function=None, y_cross_correlation_function=None, x_shifts_params: dict | None = None, y_shifts_params: dict | None = None) tuple[source]

Deduce the relative shift between the two scans. Expected behavior:

  • compute the expected overlap area from z_translations and the (sample) pixel size

  • call an (optional) cross-correlation function on the overlap area to compute the x shift and refine the y shift from projection_for_shift

Parameters:
  • upper_scan (TomoScanBase) –

  • lower_scan (TomoScanBase) –

  • estimated_shifts (tuple) – ‘a priori’ shift estimation

  • axis (int) – axis along which the overlap / stitching happens, in the 3D space (sample / detector referential)

  • flip_ud_upper_frame (bool) – is the upper frame flipped

  • flip_ud_lower_frame (bool) – is the lower frame flipped

  • projection_for_shift (Union[int,str]) – index of the projection to use (interpreted in projection space for now, not in scan space) or a string. If a string, it must be one of (middle, first, last)

  • invert_order (bool) – whether the projection order is inverted between the two scans (the case when rotation angles are inverted)

  • x_cross_correlation_function (str) – optional method to refine the x shift by computing a cross correlation. For now valid values are: (“skimage”, “nabu-fft”)

  • y_cross_correlation_function (str) – optional method to refine the y shift by computing a cross correlation. For now valid values are: (“skimage”, “nabu-fft”)

  • x_shifts_params – parameters to find the shift over x

  • y_shifts_params – parameters to find the shift over y

Returns:

relative shift of lower_scan with upper_scan as reference: (y_shift, x_shift)

Return type:

tuple

Warning:

this function flips frames left-right and up-down by default, so the returned shift takes these flips into account
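
A minimal sketch, assuming two NXtomo scans acquired at two vertical stages and loaded with tomoscan's NXtomoScan; file paths, entries and the 'a priori' shifts are placeholders, and the choice of axis=0 (vertical stitching) is an assumption:

    from tomoscan.esrf import NXtomoScan
    from nabu.stitching.utils.utils import find_projections_relative_shifts

    upper_scan = NXtomoScan("stage_0000.nx", entry="entry0000")  # placeholder paths / entries
    lower_scan = NXtomoScan("stage_0001.nx", entry="entry0000")

    y_shift, x_shift = find_projections_relative_shifts(
        upper_scan=upper_scan,
        lower_scan=lower_scan,
        estimated_shifts=(120, 0),               # placeholder 'a priori' (y, x) shift estimation
        axis=0,                                  # stitching along the vertical axis
        projection_for_shift="middle",
        x_cross_correlation_function="skimage",  # refine the x shift with scikit-image
    )
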

nabu.stitching.utils.utils.find_shift_correlate(img1, img2, padding_mode='reflect')[source]
nabu.stitching.utils.utils.find_shift_with_itk(img1: ndarray, img2: ndarray) tuple[source]
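
A minimal sketch of the two low-level helpers on synthetic images (find_shift_with_itk additionally requires the optional ITK dependency); the synthetic translation below is only illustrative:

    import numpy
    from scipy.ndimage import shift as apply_shift
    from nabu.stitching.utils.utils import find_shift_correlate, find_shift_with_itk

    img1 = numpy.random.random((128, 128)).astype(numpy.float32)
    img2 = apply_shift(img1, (3.0, -2.0))  # img1 translated by a known (y, x) offset

    shift_from_correlation = find_shift_correlate(img1, img2)   # correlation-based estimate
    shift_from_itk = find_shift_with_itk(img1=img1, img2=img2)  # ITK image registration (v4) estimate
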
nabu.stitching.utils.utils.from_slice_to_n_elements(slice_: slice | tuple)[source]

Return the number of elements in a slice or in a tuple
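
A small sketch of the two accepted input types; the counts given in the comments are assumptions based on the description above:

    from nabu.stitching.utils.utils import from_slice_to_n_elements

    n_from_tuple = from_slice_to_n_elements((2, 5, 8))         # a tuple of indices: 3 elements expected
    n_from_slice = from_slice_to_n_elements(slice(10, 20, 1))  # a contiguous slice: 10 elements expected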