Merged
27 commits
51f6522 Add notebook with LokiWorkflow (SimonHeybrock, Sep 1, 2025)
5cdbd72 Assume all banks have the same size (SimonHeybrock, Sep 1, 2025)
6fb18a7 Merge branch 'main' into loki-detector-test (nvaytet, Dec 4, 2025)
fab0c58 replace tof providers by using GenericTofWorkflow (nvaytet, Dec 4, 2025)
bef816f switch to using GenericTofWorkflow for SANS/loki. We remove the Scatt… (nvaytet, Dec 5, 2025)
c9d631b Apply automatic formatting (pre-commit-ci-lite[bot], Dec 5, 2025)
34e9e64 update index (nvaytet, Dec 5, 2025)
1210ea3 cleanup (nvaytet, Dec 5, 2025)
6335c27 start skeleton for tests (nvaytet, Dec 5, 2025)
1fa4aee update essreduce (nvaytet, Dec 5, 2025)
161b2ac update deps (nvaytet, Dec 5, 2025)
d1bff56 replace common with conftest in other test files (nvaytet, Dec 6, 2025)
39c3495 Merge branch 'main' into loki-detector-test (nvaytet, Dec 9, 2025)
966c399 set up test for iofq. now we only need a data file made from coda wit… (nvaytet, Dec 9, 2025)
6d6eac5 typo (nvaytet, Dec 9, 2025)
ce376c2 use modified coda file with a single event for tests (nvaytet, Dec 10, 2025)
bac07be Apply automatic formatting (pre-commit-ci-lite[bot], Dec 10, 2025)
eda6a3d update docs notebook (nvaytet, Dec 10, 2025)
0f5d084 update docs index (nvaytet, Dec 10, 2025)
e4bcb11 Merge branch 'loki-detector-test' of github.com:scipp/esssans into lo… (nvaytet, Dec 10, 2025)
089aa8d fix tests (nvaytet, Dec 10, 2025)
0a4aea1 really fix tests (nvaytet, Dec 10, 2025)
c2d8743 add tof to docs deps (nvaytet, Dec 10, 2025)
7ee9065 update deps (nvaytet, Dec 10, 2025)
2df9026 better detector sizes (nvaytet, Dec 11, 2025)
84936a1 remove commented code (nvaytet, Dec 11, 2025)
531c25c update notebooks in docs (nvaytet, Dec 11, 2025)
2 changes: 2 additions & 0 deletions docs/user-guide/loki/index.md
Original file line number Diff line number Diff line change
@@ -7,5 +7,7 @@ maxdepth: 1

loki-direct-beam
loki-iofq
loki-reduction-ess
workflow-widget-loki
loki-make-tof-lookup-table
```
125 changes: 125 additions & 0 deletions docs/user-guide/loki/loki-make-tof-lookup-table.ipynb
@@ -0,0 +1,125 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "0",
"metadata": {},
"source": [
"# Create a time-of-flight lookup table for LoKI"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "1",
"metadata": {},
"outputs": [],
"source": [
"import scipp as sc\n",
"from ess.reduce import time_of_flight\n",
"from ess.reduce.nexus.types import AnyRun"
]
},
{
"cell_type": "markdown",
"id": "2",
"metadata": {},
"source": [
"## Setting up the workflow\n",
"\n",
"Note here that for now, we have no chopper in the beamline.\n",
"This should be added in the next iteration."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3",
"metadata": {},
"outputs": [],
"source": [
"source_position = sc.vector([0, 0, 0], unit='m')\n",
"\n",
"wf = time_of_flight.TofLookupTableWorkflow()\n",
"wf[time_of_flight.DiskChoppers[AnyRun]] = {}\n",
"wf[time_of_flight.SourcePosition] = source_position\n",
"wf[time_of_flight.NumberOfSimulatedNeutrons] = 200_000 # Increase this number for more reliable results\n",
"wf[time_of_flight.SimulationSeed] = 1234\n",
"wf[time_of_flight.PulseStride] = 1\n",
"wf[time_of_flight.LtotalRange] = sc.scalar(9.0, unit=\"m\"), sc.scalar(35.0, unit=\"m\")\n",
"wf[time_of_flight.DistanceResolution] = sc.scalar(0.1, unit=\"m\")\n",
"wf[time_of_flight.TimeResolution] = sc.scalar(250.0, unit='us')\n",
"wf[time_of_flight.LookupTableRelativeErrorThreshold] = 1.0"
]
},
{
"cell_type": "markdown",
"id": "4",
"metadata": {},
"source": [
"## Compute the table"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5",
"metadata": {},
"outputs": [],
"source": [
"table = wf.compute(time_of_flight.TimeOfFlightLookupTable)\n",
"table"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6",
"metadata": {},
"outputs": [],
"source": [
"table.plot()"
]
},
{
"cell_type": "markdown",
"id": "7",
"metadata": {},
"source": [
"## Save to file"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8",
"metadata": {},
"outputs": [],
"source": [
"# Write to file\n",
"table.save_hdf5('loki-tof-lookup-table-no-choppers.h5')"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.7"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
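The notebook above produces a table mapping neutron arrival time (and distance) to time-of-flight, which the reduction later uses to convert raw event timestamps. As a toy illustration of the lookup step, hypothetical sketch only (the real table is 2D over `Ltotal` and `event_time_offset`, is computed by the tof simulation above, and all numbers below are made up), here is a standard-library piecewise-linear interpolation over one table slice:

```python
from bisect import bisect_left

def interp(xs, ys, x):
    """Piecewise-linear interpolation over a sorted grid, clamped at the edges."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect_left(xs, x)
    x0, x1 = xs[i - 1], xs[i]
    return ys[i - 1] + (ys[i] - ys[i - 1]) * (x - x0) / (x1 - x0)

# Toy table slice at a fixed Ltotal: event_time_offset (us) -> time-of-flight (us).
# Values are invented for illustration, not taken from the workflow output.
event_time_offset = [0.0, 1000.0, 2000.0, 3000.0]
tof = [5000.0, 6000.0, 7000.0, 8000.0]

print(interp(event_time_offset, tof, 1500.0))  # 6500.0
```

Events falling outside the tabulated range are clamped here; the real workflow instead masks unreliable table entries via `LookupTableRelativeErrorThreshold`.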
163 changes: 163 additions & 0 deletions docs/user-guide/loki/loki-reduction-ess.ipynb
@@ -0,0 +1,163 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "0",
"metadata": {},
"source": [
"# Loki workflow\n",
"\n",
"A short experimental notebook to illustrate how to set-up and run the reduction workflow for Loki @ ESS."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "1",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"import scipp as sc\n",
"import ess.loki.data # noqa: F401\n",
"from ess import loki\n",
"from ess.sans.types import *"
]
},
{
"cell_type": "markdown",
"id": "2",
"metadata": {},
"source": [
"## Workflow setup"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3",
"metadata": {},
"outputs": [],
"source": [
"wf = loki.LokiWorkflow()\n",
"\n",
"# Set detector bank name: in this case there is only one bank\n",
"wf[NeXusDetectorName] = \"loki_detector_0\"\n",
"\n",
"# Wavelength and Q binning parameters\n",
"wf[WavelengthBins] = sc.linspace(\"wavelength\", 1.0, 13.0, 201, unit=\"angstrom\")\n",
"wf[QBins] = sc.linspace(dim=\"Q\", start=0.01, stop=0.3, num=101, unit=\"1/angstrom\")\n",
"\n",
"# Other parameters\n",
"wf[CorrectForGravity] = True\n",
"wf[UncertaintyBroadcastMode] = UncertaintyBroadcastMode.upper_bound\n",
"wf[ReturnEvents] = False\n",
"wf[BeamCenter] = sc.vector([0.0, 0.0, 0.0], unit=\"m\")\n",
"wf[DirectBeam] = None\n",
"wf[DetectorMasks] = {}\n",
"wf[TimeOfFlightLookupTableFilename] = loki.data.loki_tof_lookup_table_no_choppers()\n",
"\n",
"# Use a small dummy file for testing.\n",
"# TODO: We currently use the same file for all runs; this should be updated\n",
"# once we have files from an actual run.\n",
"wf[Filename[SampleRun]] = loki.data.loki_coda_file_one_event()\n",
"wf[Filename[EmptyBeamRun]] = loki.data.loki_coda_file_one_event()\n",
"wf[Filename[TransmissionRun[SampleRun]]] = loki.data.loki_coda_file_one_event()\n",
"\n",
"# Visualize the workflow\n",
"wf.visualize(IntensityQ[SampleRun], graph_attr={'rankdir': 'LR'})"
]
},
{
"cell_type": "markdown",
"id": "4",
"metadata": {},
"source": [
"### Compute $I(Q)$\n",
"\n",
"We compute the `IntensityQ` for the sample run.\n",
"\n",
"**Note:** since we are currently using the same file for sample, empty-beam, and transmission runs,\n",
"the final results are meaningless (NaNs in all Q bins). However, this should not prevent the workflow\n",
"from running."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5",
"metadata": {},
"outputs": [],
"source": [
"wf.compute(IntensityQ[SampleRun])"
]
},
{
"cell_type": "markdown",
"id": "6",
"metadata": {},
"source": [
"## Map over detector banks\n",
"\n",
"Loki has 9 detectors banks, and in principle we would want to run the same workflow on all banks\n",
"(treating all pixels in the same way).\n",
"\n",
"To compute a reduced result for all banks, we map the workflow over all bank names:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "7",
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"\n",
"bank_ids = list(range(9))\n",
"bank_names = [f'loki_detector_{i}' for i in bank_ids]\n",
"param_table = pd.DataFrame({NeXusDetectorName: bank_names}, index=bank_ids).rename_axis(\n",
" index='bank_id'\n",
")\n",
"param_table"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8",
"metadata": {},
"outputs": [],
"source": [
"mapped = wf.map(param_table)\n",
"\n",
"results = sciline.compute_mapped(mapped, IntensityQ[SampleRun])\n",
"\n",
"# Convert to a DataGroup for better notebook visualization\n",
"sc.DataGroup({str(k): v for k, v in results.items()})"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.7"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
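The map-over-banks cell above relies on sciline's `wf.map` / `compute_mapped`, which conceptually run the same pipeline once per row of the parameter table and collect one result per row id. A minimal dict-based stand-in of that pattern, hypothetical names only, not sciline's actual API or the real reduction:

```python
# Stand-in for mapping a workflow over a parameter table: call the same
# reduction function once per row and key the results by row id.
def compute_mapped(reduce_fn, param_table):
    return {row_id: reduce_fn(**params) for row_id, params in param_table.items()}

def reduce_bank(detector_name):
    # Placeholder "reduction" that just records which bank was processed.
    return f"I(Q) for {detector_name}"

# One row per detector bank, mirroring the pandas table in the notebook.
param_table = {i: {"detector_name": f"loki_detector_{i}"} for i in range(9)}
results = compute_mapped(reduce_bank, param_table)
print(len(results), results[0])  # 9 I(Q) for loki_detector_0
```

In sciline the mapping is done lazily on the task graph rather than eagerly as here, so shared intermediate results (e.g. the empty-beam run) are computed only once.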
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -32,7 +32,7 @@ requires-python = ">=3.11"
dependencies = [
"dask>=2022.1.0",
"graphviz>=0.20",
"essreduce>=25.11.0",
"essreduce>=25.12.1",
"numpy>=1.26.4",
"pandas>=2.1.2",
"plopp>=25.03.0",
2 changes: 1 addition & 1 deletion requirements/base.in
@@ -4,7 +4,7 @@
# The following was generated by 'tox -e deps', DO NOT EDIT MANUALLY!
dask>=2022.1.0
graphviz>=0.20
essreduce>=25.11.0
essreduce>=25.12.1
numpy>=1.26.4
pandas>=2.1.2
plopp>=25.03.0