In this part, we see how to process a helical acquisition step by step: converting the Bliss scans to NeXus, finding the centers of rotation, concatenating the stages, and setting up the reconstruction configuration. We are going to use three z-steps, out of the 44 z-steps performed on a sheep head.
The instructions below convert the Bliss scans found in the directory /scisoft/tomo_training/helical/sheep/ for the sample named HA-1100_27.73um_sheep-head_ethanol_W_, converting only stages 29 to 31 and interpolating the reference scans. The symbol \ continues a command over a line break. You can copy and paste the lines and run them in the Linux shell.
First we define some auxiliary variables to avoid writing out the long path names from which we take the data:
ROOT=/scisoft/tomo_training/helical/sheep/
SAMPLE=HA-1100_27.73um_sheep-head_ethanol_W_
BEGIN=REF_B_0000
END=REF_E_0000
Then we compose the data-source paths, using the auxiliary variables:
filename_template=${ROOT}/${SAMPLE}_XXXX//${SAMPLE}_XXXX_0001//${SAMPLE}_XXXX_0001.h5
ref_scan_begin=${ROOT}/${SAMPLE}${BEGIN}//${SAMPLE}${BEGIN}_0001//${SAMPLE}${BEGIN}_0001.h5
ref_scan_end=${ROOT}/${SAMPLE}${END}//${SAMPLE}${END}_0001//${SAMPLE}${END}_0001.h5
The above data sources are needed to generate the NeXus files: they indicate the names of the scans and of the two reference scans, one acquired at the beginning and the other at the end of the series. The following three lines define the NeXus names, for future reference when we use the converted results.
nexus_name_template=${SAMPLE}_XXXX_0001.nx
nexus_ref_scan_begin=${SAMPLE}REF_B_0000_0001.nx
nexus_ref_scan_end=${SAMPLE}REF_E_0000_0001.nx
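Before running the conversion, it can help to check that the template resolves to existing files. Below is a hypothetical helper (the expand_stage function and the 4-digit zero-padded stage numbering are assumptions based on names like REF_B_0000 in this dataset) that expands the XXXX placeholder for one stage:

```shell
# Hypothetical helper: expand the XXXX placeholder for a given stage number.
# Assumes 4-digit zero-padded stage numbers, as suggested by names like REF_B_0000.
expand_stage() {
    local template=$1 stage=$2 padded
    printf -v padded '%04d' "$stage"      # e.g. 29 -> 0029
    echo "${template//XXXX/$padded}"
}

ROOT=/scisoft/tomo_training/helical/sheep/
SAMPLE=HA-1100_27.73um_sheep-head_ethanol_W_
filename_template=${ROOT}/${SAMPLE}_XXXX//${SAMPLE}_XXXX_0001//${SAMPLE}_XXXX_0001.h5

# Print the resolved path for the first converted stage; an `ls` on it
# should succeed if the raw data are in place.
expand_stage "$filename_template" 29
```

nxtomomill presumably performs an equivalent expansion internally; the helper above is only a convenience for checking paths by hand.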
We can now pass the parameters to the nxtomomill command to batch-convert all the needed Bliss scans into the NeXus datasets that we will use later:
nxtomomill zstages2nxs \
--filename_template $filename_template \
--entry_name entry0000 \
--total_nstages 44 \
--first_stage 29 \
--last_stage 31 \
--do_references True \
--ref_scan_begin $ref_scan_begin \
--ref_scan_end $ref_scan_end \
--target_directory ./
This step takes some time: the reference scans (many radiographs) are read and averaged into the reference flats.
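After the conversion finishes, a quick way to verify the outputs is to check that one .nx file per stage is present in the target directory. The check_stages helper below is hypothetical; the file naming follows the nexus_name_template defined above:

```shell
# Hypothetical sanity check: report which converted NeXus files are present
# in the current directory, using the naming of nexus_name_template above.
check_stages() {
    local sample=$1 first=$2 last=$3 stage padded f
    for ((stage = first; stage <= last; stage++)); do
        printf -v padded '%04d' "$stage"
        f=${sample}_${padded}_0001.nx
        if [ -e "$f" ]; then
            echo "found   $f"
        else
            echo "missing $f"
        fi
    done
}

check_stages HA-1100_27.73um_sheep-head_ethanol_W_ 29 31
```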
Now that we have everything we need in NeXus format, we go ahead with the determination of the centers of rotation. This information is used in the step following this one.
We use the composite algorithm with the near option. This creates the file ${SAMPLE}_cors.txt with the centers of rotation of stages 29 to 31.
nabu-composite-cor \
--filename_template $nexus_name_template \
--entry_name entry0000 \
--num_of_stages 3 \
--first_stage 29 \
--output_file ${SAMPLE}_cors.txt \
--cor_options "side='near'; near_pos = 1100.0; near_width = 20.0"
The following nxtomomill command generates one NeXus helical dataset, with associated flats/darks created by interpolation between scan_before and scan_after:
nxtomomill z-concatenate-scans \
--filename_template $nexus_name_template \
--target_file ${SAMPLE}.nx \
--entry_name entry0000 \
--total_nstages 44 \
--first_stage 29 \
--last_stage 31 \
--cors_file ${SAMPLE}_cors.txt \
--pixel_size_m 0.00002773 \
--flats_from_before_after yes \
--scan_before $nexus_ref_scan_begin \
--scan_after $nexus_ref_scan_end
The CoR positions are used to set the x translations in the final helical scan ${SAMPLE}.nx.
To create a configuration file from scratch, the command-line tool nabu-config can be used:
nabu-config --helical 1 --output ${SAMPLE}.conf --dataset ${SAMPLE}.nx
This creates a file HA-1100_27.73um_sheep-head_ethanol_W_.conf (nabu.conf would be the default name) with the dataset location pre-filled (it defaults to an empty slot). To see all the parameters, add --level advanced; for a minimalistic version, use instead --level required.
The important keys, specific to the helical case, are discussed at this link. To finalise the configuration file you need to:
set the Paganin options
set the processes_file parameter for the weights map and the double flat
set rotation_axis_position
specify the output file/directory in the [output] section
select the reconstruction range. For start_z and end_z you can leave the default parameters (0, -1), so that the whole doable span is reconstructed. Alternatively you can play with --dry_run 1 and, based on the obtained information, select a slice or a subregion.
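As a hedged illustration of how these keys might look once filled in — the key names come from the steps above, but the section placement and every value below are placeholder assumptions for this example, not output of the tools:

```ini
# Illustrative sketch only: section placement and all values are
# placeholder assumptions.
[preproc]
# file providing the weights map and the double flat (placeholder path)
processes_file = ./processes.h5

[phase]
# Paganin phase retrieval options (placeholder value)
method = paganin
delta_beta = 100.0

[reconstruction]
# a value taken from the generated cors file would go here (placeholder)
rotation_axis_position = 1100.0
# (0, -1) keeps the defaults: the whole doable span is reconstructed
start_z = 0
end_z = -1

[output]
location = ./results
```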
You can also experiment with the --max_chunk_size option to see if there is an impact on the total reconstruction time due to the efficiency of HDF5 reading.
Look at the results that you can obtain with the configuration file produced by the above procedure. There are strong distortions, which come from the optics. This is particularly evident if you look at the teeth: they should have sharp features, but they still appear blurred. A correction map can be generated with nabu-poly2map:
nabu-poly2map --nx 3104 \
--nz 256 \
--center_z 128.0 \
--center_x 1750 \
--c4 0.0017241379310344827 \
--c2 0.0021379310344827587 \
--axis_pos 2652.125 \
--target_file dm.h5
This creates the distortion map file dm.h5, and the following lines can then be added to the configuration file:
[preproc]
...
detector_distortion_correction=map_xz
detector_distortion_correction_options=map_x="silx:./dm.h5?path=/coords_source_x"; map_z="silx:./dm.h5?path=/coords_source_z"
Moreover, the nabu-poly2map script outputs the modified axis position according to the correction map and the input parameters. This can be reused to correct the axis position in the configuration file.
The nabu-poly2map application builds two arrays; let us call them map_x and map_z. Both are 2D arrays of shape (nz, nx). These maps are meant to be used to generate a corrected detector image: the pixel (i, j) of the corrected image is obtained by interpolating the raw data at position (map_z(i,j), map_x(i,j)).
The nabu-poly2map command

This map is determined by a user-given polynomial P(rs) in the radial variable rs = sqrt( (z-center_z)**2 + (x-center_x)**2 ) / (nx/2), where center_z and center_x give the center around which the deformation is centered. The perfect position (zp, xp), which would be observed on a perfect detector for a photon recorded at pixel (z, x) of the distorted detector, is:
(zp, xp) = (center_z, center_x) + P(rs) * ( z - center_z , x - center_x )
The polynomial is given by P(rs) = rs * (1 + c2 * rs**2 + c4 * rs**4)
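As a small numerical illustration of the raw model, the following evaluates P(rs) and the resulting (zp, xp) for one example pixel on the central row, using the c2, c4, center and nx values passed to nabu-poly2map above. Note that this is the raw polynomial only, before the rescaling and shifting that nabu-poly2map additionally applies:

```shell
# Evaluate the distortion polynomial at the example pixel (z, x) = (128, 3000),
# with the coefficients used in the nabu-poly2map call of this tutorial.
# Raw polynomial only: the final map is further rescaled and shifted.
check=$(awk 'BEGIN {
    nx = 3104; center_z = 128.0; center_x = 1750;
    c2 = 0.0021379310344827587; c4 = 0.0017241379310344827;
    z = 128.0; x = 3000;
    rs = sqrt((z - center_z)^2 + (x - center_x)^2) / (nx / 2);
    P  = rs * (1 + c2 * rs^2 + c4 * rs^4);
    zp = center_z + P * (z - center_z);
    xp = center_x + P * (x - center_x);
    printf "rs=%.4f P=%.4f zp=%.2f xp=%.2f", rs, P, zp, xp;
}')
echo "$check"
```

On the central row zp stays at center_z, while the horizontal coordinate is displaced substantially far from the center, which is why the distortion is most visible near the detector borders.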
The map is rescaled and shifted so that a perfect match is realised at the borders of a horizontal line passing through the center. This ensures coherence with the procedure of pixel-size calibration, which is performed by moving a needle horizontally and reading the motor positions at the extreme positions. The maps are written in the target file, created as an HDF5 file, in the datasets "/coords_source_x" and "/coords_source_z".