Collimated data reduction for OFFSPEC#
This notebook implements a reduction workflow for reflectometry data collected at the ISIS instrument OFFSPEC using a collimated beam. It follows the same procedure as the corresponding workflow in Mantid; see https://docs.mantidproject.org/nightly/techniques/ISIS_Reflectometry.html.
[1]:
%matplotlib widget
from datetime import datetime
import platform
import scipp as sc
from orsopy import fileio
from ess import reflectometry, offspec
from ess.reflectometry.types import *
from ess.offspec.types import *
Loading some data#
In this example, we load some test data provided by the offspec package. We need a sample measurement (the sample is Air | Si(790 A) | Cu(300 A) | SiO2) and a direct beam measurement. The latter was obtained by positioning the detector directly in the beam of incident neutrons and moving the sample out of the way. It gives an estimate for the ISIS pulse structure as a function of time-of-flight.
[2]:
wf = offspec.OffspecWorkflow()
wf[Filename[SampleRun]] = offspec.data.offspec_sample_run()
wf[Filename[ReferenceRun]] = offspec.data.offspec_direct_beam_run()
Downloading file 'sample.h5' from 'https://public.esss.dk/groups/scipp/ess/offspec/1/sample.h5' to '/home/runner/.cache/ess/offspec/1'.
Downloading file 'direct_beam.h5' from 'https://public.esss.dk/groups/scipp/ess/offspec/1/direct_beam.h5' to '/home/runner/.cache/ess/offspec/1'.
[3]:
wf.visualize(ReflectivityOverQ, graph_attr={'rankdir': 'LR'})
[3]:
Populating the ORSO header#
We will write the reduced data file following the ORSO .ort standard (https://www.reflectometry.org/file_format/specification), to enable a metadata-rich header. We will create an empty header and then populate it.
The data source information#
[4]:
header = fileio.orso.Orso.empty()
header.data_source.owner = fileio.base.Person(
name="Joshanial F. K. Cooper",
affiliation="ISIS Neutron and Muon Source",
)
header.data_source.experiment = fileio.data_source.Experiment(
title="OFFSPEC Sample Data",
instrument="OFFSPEC",
start_date="2020-12-14T10:34:02",
probe="neutron",
facility="RAL/ISIS/OFFSPEC",
)
header.data_source.sample = fileio.data_source.Sample(
name="QCS sample",
category="gas/solid",
composition="Air | Si(790 A) | Cu(300 A) | SiO2",
)
header.data_source.measurement = fileio.data_source.Measurement(
instrument_settings=fileio.data_source.InstrumentSettings(
incident_angle=fileio.base.Value(
wf.compute(RawDetectorData[SampleRun]).coords["theta"].value,
wf.compute(RawDetectorData[SampleRun]).coords["theta"].unit
),
wavelength=None,
polarization="unpolarized",
),
data_files=[
offspec.data.offspec_sample_run().rsplit("/", 1)[-1],
offspec.data.offspec_direct_beam_run().rsplit("/", 1)[-1],
],
scheme="energy-dispersive",
)
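The incident angle above is read from the 'theta' coordinate of the raw detector data. As a small convenience (not required by the workflow), the intermediate result can be computed once and reused instead of calling wf.compute twice, for example:

sample_data = wf.compute(RawDetectorData[SampleRun])  # compute once and reuse
theta = sample_data.coords["theta"]
incident_angle = fileio.base.Value(theta.value, str(theta.unit))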
The reduction details#
The reduction section can also be populated at this point. Entries such as corrections will be filled in as the reduction proceeds.
[5]:
header.reduction.software = fileio.reduction.Software(
name="essreflectometry", version=reflectometry.__version__, platform=platform.platform()
)
header.reduction.timestamp = datetime.now() # noqa: DTZ005
header.reduction.creator = fileio.base.Person(
name="I. D. Scientist",
affiliation="European Spallation Source",
contact="i.d.scientist@ess.eu",
)
header.reduction.corrections = []
header.reduction.computer = platform.node()
header.reduction.script = "offspec_reduction.ipynb"
To ensure that the header object is carried through the process, we assign it to the sample scipp.DataArray. The direct beam header object will be overwritten at the normalisation step, so we will keep this empty.
Determining the region of interest#
To determine which region of the detector contains the specular peak intensity, we plot the intensity distributions of the sample and direct-beam measurements over spectrum (detector pixel) and time-of-flight.
[6]:
wf.compute(RawDetectorData[SampleRun]).hist(tof=50).plot(norm='log') \
+ wf.compute(RawDetectorData[ReferenceRun]).hist(tof=50).plot(norm='log')
[6]:
The region of interest is set in the workflow by setting SpectrumLimits. In this case it seems the specular peak is in the region [389, 414].
[7]:
wf[SpectrumLimits] = (sc.scalar(389, unit=None), sc.scalar(414, unit=None))
header.reduction.corrections += ['region of interest defined as spectrum 389:415']
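As a quick sanity check (not part of the workflow itself), the event data can be summed over time-of-flight to show the total counts per spectrum, which should peak inside the chosen interval. This assumes, as in the plot above, that RawDetectorData holds event data binned by spectrum:

sample = wf.compute(RawDetectorData[SampleRun])
sample.hist().plot()  # counts per spectrum; the peak should lie within [389, 414]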
Coordinate transform graph#
To compute the wavelength \(\lambda\) we can use a coordinate transform graph. The OFFSPEC graph is the standard reflectometry graph, shown below.
[8]:
sc.show_graph(wf.compute(CoordTransformationGraph[SampleRun]), simplified=True)
[8]:
Since the direct beam measurement is not a reflectometry measurement, we use the no_scatter_graph to convert it to wavelength.
[9]:
sc.show_graph(wf.compute(CoordTransformationGraph[ReferenceRun]), simplified=True)
[9]:
Normalization by monitor#
It is necessary to normalize the sample and direct beam measurements by the summed monitor counts, which accounts for different measurement durations and for long-timescale natural variation in the pulse. This ensures that the final data have the correct scaling when the reflectivity is normalized. First, we convert the data to wavelength using the no_scatter_graph introduced above for the direct beam.
The most reliable monitor for the OFFSPEC instrument is 'monitor2' in the file, so this is the one used.
[10]:
wf.compute(MonitorData[SampleRun]).plot()
[10]:
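The quantity the workflow divides by is essentially the integrated monitor counts. A quick look at this number for the sample run, assuming MonitorData[SampleRun] is a single dense histogram, as the plot above suggests:

mon = wf.compute(MonitorData[SampleRun])
mon.sum().value  # total monitor counts for the sample run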
A background subtraction is then performed on the monitor data, where the background is taken as any counts at wavelengths greater than 15 Å. We also mask all events in the sample- and direct-beam measurements that fall outside of the wavelength range we expect for the instrument.
[11]:
wf[BackgroundMinWavelength] = sc.scalar(15, unit='angstrom')
wf[WavelengthBins] = sc.linspace(dim='wavelength', start=2, stop=14, num=2, unit='angstrom')
header.reduction.corrections += ['monitor background subtraction, background above 15 Å']
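For illustration only (the workflow applies this correction internally), the idea of the background subtraction can be sketched on a synthetic monitor spectrum: the mean counts above 15 Å are taken as the background level and subtracted from the whole spectrum.

# Synthetic monitor histogram with 1 Å wide bins between 1 Å and 20 Å.
edges = sc.linspace('wavelength', 1.0, 20.0, num=20, unit='angstrom')
fake_monitor = sc.DataArray(
    data=sc.ones(dims=['wavelength'], shape=[19], unit='counts'),
    coords={'wavelength': edges},
)
# Background level estimated from the bins above 15 Å.
background = fake_monitor['wavelength', sc.scalar(15.0, unit='angstrom'):].mean()
fake_monitor - background  # background-subtracted monitor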
Normalisation of sample by direct beam#
The sample and direct beam measurements (which have been normalised by monitor counts) are then histogrammed in \(Q\) over 100 geometrically spaced bins (101 bin edges). The histogrammed direct beam is then used to normalise the sample.
Importantly, some relevant metadata (including the ORSO header object) is carried over.
[12]:
wf[QBins] = sc.geomspace('Q', 0.005, 0.033, 101, unit='1/angstrom')
header.reduction.corrections += ["normalisation by direct beam"]
We will assume a resolution of 3 % of \(Q\), to be included in our file.
[13]:
wf[QResolution] = 0.03
Conversion to \(Q\)#
This normalised data can then be used to compute the reflectivity as a function of the scattering vector \(Q\).
[14]:
Roq = wf.compute(ReflectivityOverQ).hist()
Roq.plot(norm='log')
[14]:
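As a quick check, the resolution values should be about 3 % of the bin-midpoint \(Q\). This assumes the result carries a 'Q_resolution' coordinate over the \(Q\) bins, which is also what the saving step below relies on:

(Roq.coords['Q_resolution'] / sc.midpoints(Roq.coords['Q'])).values[:5]  # approximately 0.03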
Saving the scipp-reduced data as .ort#
We constructed the ORSO header throughout the reduction process. We can now make use of it when we save our .ort file.
We also need to add the columns for our uncertainties, which describe the meaning of the uncertainty and resolution values we have given.
[15]:
header.columns.append(fileio.base.ErrorColumn(error_of='R', error_type='uncertainty', value_is='sigma'))
header.columns.append(fileio.base.ErrorColumn(error_of='Q', error_type='resolution', value_is='sigma'))
Finally, we can save the file.
[16]:
import numpy as np
ds = fileio.orso.OrsoDataset(
header,
np.array([
sc.midpoints(Roq.coords['Q']).values,
Roq.data.values,
sc.stddevs(Roq.data).values,
Roq.coords['Q_resolution'].values]
).T
)
fileio.save_orso([ds], 'offspec.ort')
[17]:
!head -n 50 offspec.ort
# # ORSO reflectivity data file | 1.1 standard | YAML encoding | https://www.reflectometry.org/
# data_source:
# owner:
# name: Joshanial F. K. Cooper
# affiliation: ISIS Neutron and Muon Source
# experiment:
# title: OFFSPEC Sample Data
# instrument: OFFSPEC
# start_date: 2020-12-14T10:34:02
# probe: neutron
# facility: RAL/ISIS/OFFSPEC
# sample:
# name: QCS sample
# category: gas/solid
# composition: Air | Si(790 A) | Cu(300 A) | SiO2
# measurement:
# instrument_settings:
# incident_angle: {magnitude: 0.299933522939682, unit: deg}
# wavelength: null
# polarization: unpolarized
# data_files:
# - sample.h5
# - direct_beam.h5
# scheme: energy-dispersive
# reduction:
# software: {name: essreflectometry, version: 25.5.0, platform: Linux-6.11.0-1012-azure-x86_64-with-glibc2.39}
# timestamp: 2025-04-30T15:18:45.419280
# creator:
# name: I. D. Scientist
# affiliation: European Spallation Source
# contact: i.d.scientist@ess.eu
# corrections:
# - region of interest defined as spectrum 389:415
# - "monitor background subtraction, background above 15 \xC5"
# - normalisation by direct beam
# computer: fv-az1787-568
# script: offspec_reduction.ipynb
# data_set: 0
# columns:
# - {name: Qz, unit: 1/angstrom}
# - {name: R}
# - {error_of: R, error_type: uncertainty, value_is: sigma}
# - {error_of: Q, error_type: resolution, value_is: sigma}
# # Qz (1/angstrom) R sR sQ
5.0476246834323226e-03 5.4335807606729425e-01 4.4815418783401194e-02 1.5146230885454453e-04
5.1437812944857761e-03 6.6198996305950430e-01 4.6809759195680659e-02 1.5433817019431845e-04
5.2417696768037696e-03 6.3655329251146431e-01 4.3513887256205225e-02 1.5726292483764080e-04
5.3416247253969375e-03 6.4456643716187512e-01 4.0364316016897377e-02 1.6028721377817236e-04
5.4433820000214534e-03 6.3510973660558190e-01 3.8368382357253872e-02 1.6332953068035961e-04
5.5470777378423403e-03 6.6261091612434919e-01 3.8066002527274163e-02 1.6646133258810508e-04
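The saved file can also be read back with orsopy as a round-trip check. A minimal sketch, assuming fileio.load_orso returns a list of OrsoDataset objects, each with an info header and a data array:

loaded = fileio.load_orso('offspec.ort')
loaded[0].data.shape  # expect (number of Q bins, 4 columns)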
[18]:
header.columns
[18]:
[Column(name='Qz', unit='1/angstrom', physical_quantity=None, flag_is=None, comment=None),
Column(name='R', unit=None, physical_quantity=None, flag_is=None, comment=None),
ErrorColumn(error_of='R', error_type='uncertainty', value_is='sigma', distribution=None, comment=None),
ErrorColumn(error_of='Q', error_type='resolution', value_is='sigma', distribution=None, comment=None)]