Parameter estimation¶
When some parameters cannot be inferred directly from the dataset metadata (e.g. the center of rotation for reconstruction), one often has to determine them by trial and error. Nabu provides estimation utilities in the nabu.estimation module.
Center of Rotation¶
Nabu offers a variety of Center of Rotation (CoR) estimation methods, which fall into two categories:
projections-based methods
sinogram-based methods
Note
Currently, no reconstruction-based methods are available. However, higher-level tools like tomwer can be used.
Projection-based methods find the half-shift between two images. The center of the vertical rotation axis is obtained when the first image is a radiograph at rotation angle 0 and the second image is the radiograph at rotation angle 180, flipped horizontally. The rotation axis position is then the center of the image plus the found shift.
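The half-shift principle can be illustrated on a synthetic 1D profile. The following sketch (pure NumPy, not the nabu API; all names are illustrative) recovers the rotation axis position from the translation between the 0-degree profile and the horizontally flipped 180-degree profile:

```python
import numpy as np

# Illustrative sketch (pure NumPy, not the nabu API): recover the CoR
# from a 0-degree profile and the flipped 180-degree profile.
width = 64
true_cor = 35.0  # rotation axis position, in pixels
x = np.arange(width)

# A feature at x=40 seen at angle 0 appears mirrored about the
# rotation axis (at 2*CoR - 40) in the 180-degree radiograph.
radio_0 = np.exp(-((x - 40.0) ** 2) / 20.0)
radio_180 = np.exp(-((x - (2 * true_cor - 40.0)) ** 2) / 20.0)

# Flipping the 180-degree profile horizontally turns the mirror relation
# into a pure translation; find it by circular cross-correlation.
flipped = radio_180[::-1]
corr = np.fft.irfft(np.fft.rfft(radio_0) * np.conj(np.fft.rfft(flipped)), n=width)
shift = np.argmax(corr)  # translation between radio_0 and flipped

# The CoR is the image center plus half the translation.
cor_estimate = (width - 1) / 2 + shift / 2
```

With these synthetic values the translation is 7 pixels, giving a CoR estimate of 35.0, matching the simulated axis position.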
Configuration file: section [reconstruction], key rotation_axis_position. Values can be:
Empty (default): the CoR is set to the middle of the detector: (detector_width - 1)/2.0
A number (known CoR)
centered: a fast and simple auto-CoR method. It only works when the CoR is not far from the middle of the detector. It does not work for half-tomography.
global: a slow but robust auto-CoR.
sliding-window: semi-automatically finds the CoR with a sliding window. You have to specify on which side of the detector the CoR is (left, center, right).
growing-window: automatically finds the CoR with a sliding-and-growing window. You can tune the options with the parameter cor_options.
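For instance, a minimal configuration file excerpt selecting one of these methods might look like (illustrative example, not taken from a real dataset):

```
[reconstruction]
rotation_axis_position = sliding-window
```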
API: CenterOfRotation
Advanced parameters in configuration file¶
Advanced parameters can be provided in the configuration file with the key cor_options. The parameters are separated by semicolons and passed as 'name=value'. For example:
cor_options = low_pass=1; high_pass=20
Mind the semicolon separator (;). These advanced parameters correspond to the arguments of the function find_shift.
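The 'name=value' pairs map naturally to keyword arguments. A minimal sketch of such parsing (a hypothetical helper for illustration, not nabu's actual parser):

```python
# Hypothetical sketch of turning a "name=value; name=value" string into
# keyword arguments (not nabu's actual parser).
def parse_options(options_str):
    opts = {}
    for item in options_str.split(";"):
        name, _, value = item.partition("=")
        opts[name.strip()] = float(value)
    return opts

kwargs = parse_options("low_pass=1; high_pass=20")
# kwargs can then be forwarded as keyword arguments, e.g.
# find_shift(..., **kwargs)
```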
Tilt angle¶
Nabu provides methods for detecting the detector tilt angle, or, equivalently, the rotation axis tilt in the plane parallel to the detector plane. When such a tilt occurs, the columns of the detector are not parallel to the rotation axis.
Configuration file: section [preproc], key tilt_correction. Values can be:
Empty (default): no tilt detection/correction.
A scalar value: user-provided tilt angle in degrees.
1d-correlation: auto-detect the tilt with the 1D correlation method (fastest, but works best for small tilts).
fft-polar: auto-detect the tilt with the polar FFT method (slower, but works well on all ranges of tilts).
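As an illustration, enabling automatic tilt detection with the polar FFT method in the configuration file could look like:

```
[preproc]
tilt_correction = fft-polar
```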
When a tilt is detected or provided by the user, each projection image is rotated to correct for this angle.
API: CameraTilt
Advanced parameters in configuration file¶
Advanced parameters can be provided in the configuration file by means of the key autotilt_options. The parameters are separated by semicolons and passed as 'name=value'. For example:
autotilt_options = median_filt_shape=(3,3); threshold=0.5
Mind the semicolon separator (;). These advanced parameters correspond to the arguments of the function compute_angle.
The parameter threshold indicates the value, in pixels, below which the effect of the tilt is ignored. Given a tilt angle a, the maximum displacement on a detector of width N is N/2 * sin(a). If N/2 * sin(a) < threshold, the tilt is considered too small to be corrected and is ignored. For example, on a detector of width 2560 pixels, a tilt of 0.05 degrees induces a shift of at most 1.12 pixels. The default threshold is 0.25 pixels.
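The worked example above can be checked numerically with a short sketch:

```python
import numpy as np

# Maximum displacement caused by a tilt "a" on a detector of width N.
N = 2560      # detector width, in pixels
a_deg = 0.05  # tilt angle, in degrees
max_shift = N / 2 * np.sin(np.deg2rad(a_deg))
# max_shift is about 1.12 pixels, above the default threshold of 0.25,
# so this tilt would be corrected.
```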
Detector Translation Along the Beam¶
When the detector is moved along its longitudinal translation axis, the beam image recorded on the detector can be seen to move if the translation is not parallel to the beam. The resulting shifts can be measured, and the tilt of the beam with respect to the translation axis can be inferred.
The vertical and horizontal shifts are returned in pixels per unit of translation.
To compute the vertical and horizontal tilt angles from the obtained shift_pix_per_unit_translation:
tilt_deg = np.rad2deg(np.arctan(shift_pix_per_unit_translation * pixel_size))
where pixel_size and the input parameter img_pos have to be expressed in the same units.
API: DetectorTranslationAlongBeam
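As an illustration of the formula above, the following sketch (pure NumPy; the variable names and values are illustrative, not the nabu API) fits the shift per unit of translation from a series of detector positions and converts it to a tilt angle:

```python
import numpy as np

# Illustrative sketch: given detector positions along the beam and the
# measured horizontal image shift at each position, fit the shift per
# unit of translation, then convert it to a tilt angle.
img_pos = np.array([0.0, 10.0, 20.0, 30.0])  # translation positions, e.g. in mm
shifts_h = np.array([0.0, 2.0, 4.0, 6.0])    # measured shifts, in pixels

# Linear fit: slope in pixels of shift per unit of translation
shift_pix_per_unit_translation = np.polyfit(img_pos, shifts_h, 1)[0]

pixel_size = 0.05  # mm; must be in the same units as img_pos
tilt_deg = np.rad2deg(np.arctan(shift_pix_per_unit_translation * pixel_size))
# here the slope is 0.2 px/mm, giving a tilt of about 0.57 degrees
```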