Workflow#
In this example, we will use a McStas 3 simulation file.
Build Pipeline (Collect Parameters and Providers)#
Import the providers from load_mcstas_nexus to use the McStas simulation data workflow. MaximumProbability can be provided manually to derive a more realistic number of events, since weights in a McStas file are given as probabilities rather than as numbers of events.
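As a rough sketch of why such a scale factor is needed (hypothetical numbers, not the ess.nmx implementation): the largest event weight is mapped to the maximum-counts target, and all other weights are scaled by the same factor:

```python
import numpy as np

# Made-up McStas event weights (probabilities, not counts).
weights = np.array([0.002, 0.010, 0.004, 0.008])
max_counts = 10_000  # target count for the most probable event

# Scale so that the largest weight corresponds to max_counts.
scale = max_counts / weights.max()
counts = weights * scale

print(counts)  # the largest weight maps to 10000
```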
[1]:
import scipp as sc  # used below for sc.arange

from ess.nmx.mcstas import McStasWorkflow
from ess.nmx.data import small_mcstas_3_sample
from ess.nmx.types import *
from ess.nmx.reduction import merge_panels
from ess.nmx.nexus import export_as_nexus
wf = McStasWorkflow()
# Replace with the path to your own file
wf[FilePath] = small_mcstas_3_sample()
wf[MaximumCounts] = 10000
wf[TimeBinSteps] = 50
Downloading file 'small_mcstas_3_sample.h5' from 'https://public.esss.dk/groups/scipp/ess/nmx/small_mcstas_3_sample.h5' to '/home/runner/.cache/essnmx/0'.
To see what the workflow can produce, display it:
[2]:
wf
[2]:
| Name | Value | Source |
|---|---|---|
| CrystalRotation | | ess.nmx.mcstas.load.load_crystal_rotation |
| DetectorBankPrefix | | ess.nmx.mcstas.load.load_event_data_bank_name |
| DetectorIndex | | |
| DetectorName | | ess.nmx.mcstas.load.detector_name_from_index |
| FilePath | /home/runner/.cache/essnmx/0/small_mcstas_3_sample.h5 | |
| MaximumCounts | 10000 | |
| MaximumProbability | | ess.nmx.mcstas.load.maximum_probability |
| MaximumTimeOfArrival | | ess.nmx.reduction.calculate_maximum_toa |
| McStasInstrument | | ess.nmx.mcstas.xml.read_mcstas_geometry_xml |
| McStasWeight2CountScaleFactor | | ess.nmx.mcstas.load.mcstas_weight_to_probability_scalefactor |
| MinimumTimeOfArrival | | ess.nmx.reduction.calculate_minimum_toa |
| NMXDetectorMetadata | | ess.nmx.mcstas.load.load_detector_metadata |
| NMXExperimentMetadata | | ess.nmx.mcstas.load.load_experiment_metadata |
| NMXRawDataMetadata | | ess.nmx.mcstas.load.retrieve_raw_data_metadata |
| NMXRawEventCountsDataGroup | | ess.nmx.mcstas.load.load_mcstas |
| NMXReducedCounts | | ess.nmx.reduction.raw_event_probability_to_counts |
| NMXReducedDataGroup | | ess.nmx.reduction.format_nmx_reduced_data |
| NMXReducedProbability | | ess.nmx.reduction.reduce_raw_event_probability |
| PixelIds | | ess.nmx.mcstas.load.retrieve_pixel_ids |
| ProtonCharge | | ess.nmx.reduction.proton_charge_from_event_counts |
| RawEventProbability | | ess.nmx.mcstas.load.load_raw_event_data |
| TimeBinSteps | 50 | |
We want to reduce all three panels, so we map the relevant part of the workflow over a list of the three panels:
[3]:
# DetectorIndex selects what detector panels to include in the run
# in this case we select all three panels.
wf[NMXReducedDataGroup] = (
wf[NMXReducedDataGroup]
.map({DetectorIndex: sc.arange('panel', 3, unit=None)})
.reduce(index="panel", func=merge_panels)
)
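The map/reduce step above can be sketched in plain Python (a conceptual stand-in, not the sciline API; `reduce_one_panel` and `merge` are hypothetical helpers): the same per-panel computation runs once per detector index, and the results are then merged into a single object, analogous to `merge_panels`.

```python
# Conceptual sketch of mapping a workflow over panels and reducing the results.
def reduce_one_panel(panel_index):
    # Stand-in for the per-panel part of the workflow.
    return {"panel": panel_index, "counts": 100 * (panel_index + 1)}

def merge(results):
    # Stand-in for ess.nmx.reduction.merge_panels.
    return {
        "panels": [r["panel"] for r in results],
        "counts": sum(r["counts"] for r in results),
    }

merged = merge([reduce_one_panel(i) for i in range(3)])
print(merged)  # {'panels': [0, 1, 2], 'counts': 600}
```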
Build Workflow#
[4]:
wf.visualize(NMXReducedDataGroup, graph_attr={"rankdir": "TD"}, compact=True)
[4]:
Compute Desired Types#
[5]:
from cyclebane.graph import NodeName, IndexValues
# Data from all selected detectors binned by panel, pixel and timeslice
targets = [NodeName(NMXReducedDataGroup, IndexValues(("panel",), (i,))) for i in range(3)]
dg = merge_panels(*wf.compute(targets).values())
dg
[5]:
- counts: DataArray (panel: 3, id: 1638400, t: 50), float64 [counts]
- proton_charge: Variable (panel: 3), float64 [counts]: 10.506, 9.440, 9.867
- crystal_rotation: Variable (), vector3 [deg]: [0, 0, 0]
- sample_position: Variable (), vector3 [m]: [0, 0, 0]
- source_position: Variable (), vector3 [m]: [-0.53123, 0, -157.405]
- sample_name: Variable (), string: sampleMantid
- fast_axis: Variable (panel: 3), vector3 [dimensionless]: [0.999986, 0, -0.00529148], [-0.00531614, 0, -0.99998587], [0.00531614, 0, 0.99998587]
- slow_axis: Variable (), vector3 [dimensionless]: [0, 1, 0]
- origin_position: Variable (panel: 3), vector3 [m]: [-0.248454, -0.25, 0.292], [-0.288666, -0.25, 0.252], [0.288667, -0.25, -0.251]
- position: Variable (panel: 3, id: 1638400), vector3 [m]
- detector_shape: Variable (), PyObject: (1280, 1280)
- x_pixel_size: Variable (), float64 [m]: 0.0004
- y_pixel_size: Variable (), float64 [m]: 0.0004
- detector_name: Variable (panel: 3), string: nD_Mantid_0, nD_Mantid_1, nD_Mantid_2
[6]:
dg['counts']
[6]:
Dims: panel: 3, id: 1638400, t: 50
Coordinates:
- id (panel, id), int64: pixel ids 1, 2, ..., 5638398, 5638399
- t (panel, t [bin-edge]), float64 [s]: 50 time bins per panel, edges roughly 0.097 to 0.145
Data:
- (panel, id, t), float64 [counts]: 0.0, 0.0, ..., 0.0, 0.0
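Summing the binned counts over the pixel dimension gives a per-panel time-of-arrival histogram (in scipp this would be `dg['counts'].sum('id')`). A toy numpy stand-in with made-up shapes illustrates the axis reduction:

```python
import numpy as np

# Toy stand-in for dg['counts'] with dims (panel, id, t).
counts = np.zeros((3, 8, 5))
counts[0, 2, 1] = 4.0
counts[1, 5, 3] = 2.0

# Summing over the pixel ('id') axis leaves one histogram per panel.
toa_hist = counts.sum(axis=1)
print(toa_hist.shape)  # (3, 5)
```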
Export Results#
The reduced data group can be exported to a NeXus (HDF5) file with export_as_nexus. For example, save the result as test.nxs:
[7]:
export_as_nexus(dg, "test.nxs")
/tmp/ipykernel_3157/608274272.py:1: DeprecationWarning: Exporting to custom NeXus format will be deprecated in the near future. Please use ``export_as_nxlauetof`` instead.
  export_as_nexus(dg, "test.nxs")
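To check what ended up in an exported HDF5 file, you can walk it with h5py (assuming it is installed; the file name and group layout below are made up for illustration, not the actual layout written by export_as_nexus):

```python
import h5py  # assumed available in the environment

# Write a tiny stand-in file (the real export above produced "test.nxs").
with h5py.File("example.nxs", "w") as f:
    entry = f.create_group("entry")
    entry.create_dataset("counts", data=[1.0, 2.0, 3.0])

# Re-open the file and list what was stored.
with h5py.File("example.nxs", "r") as f:
    print(list(f["entry"].keys()))  # ['counts']
```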
Instrument View#
Pixel positions are not used in later steps, but they are included in the coordinates for the instrument view.
All pixel positions are relative to the sample position, therefore the sample is at (0, 0, 0).
The instrument view might be very slow or might not work in the ``VS Code`` Jupyter notebook editor.
[8]:
import scippneutron as scn
da = dg["counts"]
da.coords["position"] = dg["position"]
# Plot one out of 100 pixels to reduce size of docs output
view = scn.instrument_view(da["id", ::100].sum('t'), pixel_size=0.0075)
view
[8]:
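The `["id", ::100]` slice above keeps every 100th pixel before plotting. A numpy equivalent of that stride, using the panel's pixel count from the data above, shows how much the plotted data shrinks:

```python
import numpy as np

# One panel has 1280 * 1280 = 1638400 pixels (see detector_shape above).
pixel_ids = np.arange(1_638_400)

# Keeping every 100th pixel, as in da["id", ::100].
subset = pixel_ids[::100]
print(len(subset))  # 16384
```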