feat: initial beer mcstas workflow #184
base: main
Conversation
src/ess/beer/conversions.py
```python
def compute_tof_from_known_peaks(
    da: DetectorData[RunType], graph: ElasticCoordTransformGraph
) -> DetectorTofData[RunType]:
    return da.transform_coords(
```
This is badly named, since it computes `dspacing` etc. in addition to `tof`.

That's just how it works right now, to get the demo notebook running without too much effort. I would like to connect to the regular powder diffraction workflow after this point.
```python
'tof': _tof_from_dhkl,
'coarse_dhkl': _compute_d,
'theta': lambda two_theta: two_theta / 2,
'dhkl_list': lambda: dhkl_list,
```
@jl-wynen in chat you mentioned it's better to generate a new `_compute_d` function binding `dhkl_list` in its scope, instead of having `dhkl_list` be an "intermediate" here and then using `keep_intermediate=False`. The plan is to do that eventually.
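The closure approach being discussed might look roughly like this. This is a plain-Python sketch of the idea, not the PR's actual code; the function body is a hypothetical stand-in (snapping each coarse d value to the nearest known peak):

```python
def make_compute_d(dhkl_list):
    """Return a _compute_d variant with dhkl_list bound in its closure.

    Because dhkl_list is captured here rather than exposed as a graph
    entry, it never appears as an intermediate coordinate, so
    keep_intermediate=False is not needed.
    """
    def _compute_d(coarse_dhkl):
        # Hypothetical body: map each coarse d value to the nearest
        # known peak position from the captured dhkl_list.
        return [min(dhkl_list, key=lambda d: abs(d - c)) for c in coarse_dhkl]

    return _compute_d

# The graph entry would then bind the list at graph-construction time,
# e.g. graph['coarse_dhkl'] = make_compute_d(dhkl_list)
compute_d = make_compute_d([0.5057, 0.5102, 0.5107])
```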
The notebook in its current shape is mainly there to display the current status of the implementation to Celine. When we know more, it will be replaced by a more useful tutorial notebook.
src/ess/beer/clustering.py
```python
da.coords['coarse_d'] = (
    constants.h
    / constants.m_n
    * (da.coords['event_time_offset'] - da.coords['approximate_t0'])
    / sc.sin(da.coords['two_theta'] / 2)
    / da.coords['L0']
    / 2
).to(unit='angstrom')
```
A bit low-level, but since I noticed this: can you use https://github.com/scipp/scippneutron/blob/5acb18bc675eba373fae94ea73012d599e1e1f49/src/scippneutron/conversion/tof.py#L61 instead of reimplementing it here?
Yes, we can.
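For reference, the relation being reimplemented above is Bragg's law solved for d: with λ = h·t / (m_n·L) and λ = 2·d·sin(θ), one gets d = h·t / (m_n·L·2·sin(θ)). A plain-Python sketch with hypothetical flight path and time-of-flight values (the scippneutron function linked above implements this same relation with proper unit handling):

```python
import math

H = 6.62607015e-34       # Planck constant, J*s
M_N = 1.67492749804e-27  # neutron mass, kg

def dspacing_from_tof(tof, Ltotal, two_theta):
    """Bragg's law: d = h * tof / (m_n * Ltotal * 2 * sin(two_theta / 2)).

    tof in seconds, Ltotal in metres, two_theta in radians; returns metres.
    """
    return H * tof / (M_N * Ltotal * 2.0 * math.sin(two_theta / 2.0))

# Hypothetical values, not BEER instrument parameters:
d = dspacing_from_tof(tof=0.01, Ltotal=30.0, two_theta=math.pi / 2)
print(d * 1e10)  # value in angstrom
```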
src/ess/beer/workflow.py
```python
0.5107,
0.5107,
0.5102,
0.5057,
```
Where do those numbers come from, and what sample do they apply to? And why are they hardcoded in the package?
They are peak positions for a particular test sample. They should not be hardcoded in the package.
```python
wf = BeerMcStasWorkflowKnownPeaks()
wf[Filename[SampleRun]] = mcstas_duplex(9)
da = wf.compute(DetectorTofData[SampleRun])
da.hist(dspacing=dspacing).plot()
```
Why does this data have a dspacing coord? It should be in tof, not dspacing.
Yes I agree, it should just have tof.
```python
wf = BeerMcStasWorkflow()
wf[Filename[SampleRun]] = mcstas_duplex(9)
da = wf.compute(DetectorTofData[SampleRun])
da.bins.coords['dspacing'] = (
    sc.constants.h / sc.constants.m_n
    * da.bins.coords['tof']
    / sc.sin(da.bins.coords['two_theta'] / 2)
    / da.bins.coords['Ltotal']
    / 2
).to(unit='angstrom')
```
This one doesn't have dspacing? Then it should be a different type.
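One way to express "a different type" is a separate domain type per coordinate system, so the workflow's annotations state what the data actually carries. A minimal sketch using plain `typing.NewType`; the type names and the converter are illustrative only (the actual ess packages use generic, run-type-parameterised types, which this deliberately simplifies):

```python
from typing import NewType

# Hypothetical domain types; real ess types are generic over RunType.
DetectorTofData = NewType('DetectorTofData', object)
DetectorDspacingData = NewType('DetectorDspacingData', object)

def to_dspacing(da: DetectorTofData) -> DetectorDspacingData:
    # Placeholder conversion; in the workflow this step would apply
    # Bragg's law to compute the dspacing coordinate.
    return DetectorDspacingData(da)
```

With distinct output types, requesting `DetectorTofData` can never silently return data that already carries a dspacing coordinate.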
Fixes #180 #181
Initial data reduction workflow for BEER McStas data files.
I've tried to keep it pretty simple and flexible for now.
I'm not sure where this should be "connected" to the regular diffraction workflow, so for now it's a separate workflow.
For now this is a draft, because the McStas data loader needs to load chopper information more completely before different chopper "modes" can be handled properly.
Comments and suggestions are welcome.