Description
Is your feature request related to a problem?
Coming back to xarray, and using it based on what I remember from a year ago or so, means I make lots of mistakes. I've also been using it outside of a repl, where error messages are more important, given I can't explore a dataset inline.
Some of the error messages could be much more helpful. Take one example:
```
xarray.core.merge.MergeError: conflicting values for variable 'date' on objects to be combined.
You can skip this check by specifying compat='override'.
```
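For context, here's a minimal sketch of the kind of merge that triggers this (made-up data; the real case had four objects):

```python
import xarray as xr

# Two datasets whose 'date' variables disagree at a single position.
a = xr.Dataset({"date": ("x", [1, 2, 3])})
b = xr.Dataset({"date": ("x", [1, 2, 4])})

# The default compat="no_conflicts" raises MergeError here, but the
# message doesn't say which positions differ or by how much.
try:
    xr.merge([a, b])
except xr.MergeError as e:
    print(e)
```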
The second sentence is nice. But the first could give us much more information:
- Which variables conflict? I'm merging four objects, so it would be really helpful to know which are causing the issue.
- What is the conflict? Is one a superset, so I can pass `join=...`? Are the values off by one, or are they completely different types?
- Our `testing.assert_equal` produces pretty nice errors, as a comparison.
Getting these right is really useful: it lets folks stay in the flow while they're working, and it signals that we're a well-built, refined library.
Describe the solution you'd like
I'm not sure the best way to surface the issues — error messages make for less legible contributions than features or bug fixes, and the primary audience for good error messages is often the opposite of those actively developing the library. They're also more difficult to manage as GH issues — there could be scores of marginal issues which would often be out of date.
One thing we do in PRQL is keep a file that snapshots error messages, `test_bad_error_messages.rs`, which makes changing those messages from bad to good a nice contribution. I'm not sure whether that would work here (Python doesn't seem to have a great snapshotter; `pytest-regtest` is the best I've found; I wrote `pytest-accept`, but it requires doctests).
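For illustration, the snapshot idea can be sketched with a tiny stdlib-only helper (all names here are hypothetical; a real plugin like `pytest-regtest` would handle updating and reporting):

```python
from pathlib import Path


def check_error_snapshot(
    name: str, message: str, snapshot_dir: Path, update: bool = False
) -> None:
    """Compare an error message against a stored snapshot file.

    The first run (or update=True) writes the snapshot; later runs fail
    on any drift, so improving a message means deliberately updating the
    stored text — and the diff documents the bad-to-good change.
    """
    snap = snapshot_dir / f"{name}.txt"
    if update or not snap.exists():
        snap.write_text(message)
        return
    expected = snap.read_text()
    if expected != message:
        raise AssertionError(
            f"error message for {name!r} changed:\n"
            f"--- snapshot\n{expected}\n--- actual\n{message}"
        )
```

Each snapshot file then records the current (possibly bad) message, and a PR that improves it shows up as a readable diff rather than a stale GH issue.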
Any other ideas?
Describe alternatives you've considered
No response
Additional context
A couple of specific error-message issues: