Fix #708 Check fastmath flags #1068
Conversation
Codecov Report
Attention: Patch coverage is
Additional details and impacted files
```
@@ Coverage Diff @@
##             main    #1068      +/-   ##
==========================================
- Coverage   97.31%   96.62%    -0.70%
==========================================
  Files          93       93
  Lines       15239    15376     +137
==========================================
+ Hits        14830    14857      +27
- Misses        409      519     +110
```
☔ View full report in Codecov by Sentry.
After going through the code, I noticed that with p=np.inf, any value less than 1.0 will be 0.0, and any value above 1.0 will be np.inf. Maybe we should open another issue for this case, which is known as the Chebyshev distance. Then, we can decide whether we should add support for it. The distance can be computed via a rolling-max approach. I am not sure if it can be added to the current code base without hurting the design. For now, we should not consider p=np.inf.
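(For reference only, and purely as an illustration of the idea above: this sketch is not part of the PR, and the function and variable names are made up. It computes a naive Chebyshev (p=np.inf) distance profile as a sliding maximum of absolute differences.)

```python
import numpy as np


def chebyshev_distance_profile(Q, T):
    """Naive Chebyshev (p=np.inf) distance profile of query Q against every
    length-m subsequence of T: the maximum absolute elementwise difference
    for each sliding window."""
    m = Q.shape[0]
    k = T.shape[0] - m + 1
    D = np.empty(k)
    for i in range(k):
        # Chebyshev distance is the largest absolute difference in the window
        D[i] = np.max(np.abs(T[i : i + m] - Q))
    return D
```

A rolling-max formulation would avoid recomputing each window from scratch, but, as noted above, fitting that into the current code base cleanly is a separate question (see #1071).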
@NimaSarajpoor I had noticed it as well when I looked the other day. I agree, we should not allow p=np.inf.
Created the issue #1071
@NimaSarajpoor Is this ready to be merged?
Not yet. I am trying to get the chain of caller-callees. I have a script that can give a dictionary with each function as key and its callees as the value.
At this point, I don't anticipate allowing any exceptions. Since this is something tedious, I would prefer to automate it and add it to
One thing to consider is that
👍
Right. This should be placed in
Currently, I am not trying to jump between modules. What I do is collect the one-level-deep callees of ALL stumpy functions; that's all I need to create the chain for a given caller. However, if I can find a tool that can jump from one module to a different module, then finding the chain should become easier. Going to look for it.
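(The actual script is collapsed above; the sketch below is only a guess at what such a one-level-deep collection could look like, using the standard ast module. The function name and the example module string are assumptions.)

```python
import ast
import importlib
import inspect


def one_level_callees(module_name):
    """Map each top-level function defined in `module_name` to the set of
    function names it calls directly (one level deep; callees of callees
    are not followed)."""
    module = importlib.import_module(module_name)
    tree = ast.parse(inspect.getsource(module))

    callees = {}
    for node in tree.body:
        if isinstance(node, ast.FunctionDef):
            calls = set()
            for sub in ast.walk(node):
                if isinstance(sub, ast.Call):
                    func = sub.func
                    if isinstance(func, ast.Name):
                        calls.add(func.id)
                    elif isinstance(func, ast.Attribute):
                        calls.add(func.attr)  # e.g., core._mass -> "_mass"
            callees[node.name] = calls
    return callees


# Example usage (module name assumed):
# print(one_level_callees("stumpy.core"))
```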
I have some ideas and will be able to share them soon |
@NimaSarajpoor While quite verbose, I believe that this will work nicely to generate a list of njit call stacks that is able to jump ACROSS modules:
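(The script itself was not captured in this excerpt. Purely as a placeholder, here is a rough sketch of one way such a cross-module call-stack generator could be put together with ast; the helper names, the naive name resolution, and the reliance on an installed stumpy package are all assumptions.)

```python
import ast
import pathlib

import stumpy  # assumes STUMPY is installed


def _is_njit(node):
    """True if a FunctionDef carries an @njit decorator (bare or called)."""
    for dec in node.decorator_list:
        target = dec.func if isinstance(dec, ast.Call) else dec
        name = target.attr if isinstance(target, ast.Attribute) else getattr(target, "id", None)
        if name == "njit":
            return True
    return False


def _call_names(node):
    """Bare names of every function called anywhere inside `node`."""
    names = set()
    for sub in ast.walk(node):
        if isinstance(sub, ast.Call):
            func = sub.func
            if isinstance(func, ast.Name):
                names.add(func.id)
            elif isinstance(func, ast.Attribute):
                names.add(func.attr)
    return names


def njit_call_stacks():
    """Enumerate njit-to-njit call stacks across all stumpy modules."""
    raw = {}  # "module.func" -> set of bare callee names
    for py_file in pathlib.Path(stumpy.__file__).parent.glob("*.py"):
        tree = ast.parse(py_file.read_text())
        for node in tree.body:
            if isinstance(node, ast.FunctionDef) and _is_njit(node):
                raw[f"{py_file.stem}.{node.name}"] = _call_names(node)

    # Naively resolve bare callee names to "module.func" keys
    # (assumes njit function names are unique across modules)
    by_name = {key.rsplit(".", 1)[1]: key for key in raw}
    graph = {
        key: sorted(by_name[name] for name in callees if name in by_name)
        for key, callees in raw.items()
    }

    stacks = []

    def dfs(func, path):
        path = path + [func]
        children = [c for c in graph[func] if c not in path]  # guard against cycles
        if not children:
            stacks.append(path)
        else:
            for child in children:
                dfs(child, path)

    for func in sorted(graph):
        dfs(func, [])
    return stacks


if __name__ == "__main__":
    for stack in njit_call_stacks():
        print(" -> ".join(stack))
```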
The output should be:
One immediate observation is that our njit call stacks are very, very flat/shallow, which is a GREAT thing! In my head, I was dreading finding that we have very deeply nested call stacks, so this is a pleasant surprise. Please verify that we haven't missed any edge cases. Hopefully, you are able to take these call stacks and check the fastmath flags.
Thanks for sharing it!! I will go through it. I can compare its output with the one I get from my own script (a script that I used locally, on top of the script I shared in my previous comment, to get the caller-callee chains). And then I will work on checking the fastmath flags.
YES! I noticed it too when I obtained the caller-callee chains. 👍
Using the code provided in #1068 (comment), I was able to detect call stacks where the number of "fastmath flags" variants is more than one.
Running the code in the branch
I will add the code to
In addition, I will use the code in #1025 locally to check the output of
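(For reference, a minimal sketch of that kind of check is shown below. It assumes call stacks are lists of fully qualified function names and that each function is a numba dispatcher whose fastmath setting can be read from its targetoptions dict; the helper names here are made up.)

```python
import importlib


def fastmath_flag(qualified_name):
    """Read the fastmath setting of an njit-decorated function, e.g.
    "stumpy.core._mass". An absent key means the default (fastmath=False)."""
    module_name, func_name = qualified_name.rsplit(".", 1)
    func = getattr(importlib.import_module(module_name), func_name)
    flag = func.targetoptions.get("fastmath", False)
    # A set of LLVM flags (e.g. {"nsz", "reassoc"}) is unhashable; freeze it
    return frozenset(flag) if isinstance(flag, set) else flag


def mixed_fastmath_stacks(call_stacks):
    """Return (stack, variants) pairs for call stacks that mix more than one
    distinct fastmath setting."""
    mixed = []
    for stack in call_stacks:
        variants = {fastmath_flag(name) for name in stack}
        if len(variants) > 1:
            mixed.append((stack, variants))
    return mixed
```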
I used the ugly code in #1025 to see if it can provide a njit caller-callee chain that is not in the output of
Note that this is just one level deep, meaning it ignores the callee(s) of a callee. I will find and fix the root cause of such a discrepancy.
I think I've spotted a small bug in #1068 (comment). In the method
If you delete the offending
I hope that helps.
Removing the line
Regarding the first two cases, I noticed that the code now considers the module
Potential solution:
@seanlaw I think I've found the answer to my question:
Yeah, pushing and popping should be paired actions, so as long as we keep track of them correctly (i.e., without mis-counting), I think that is the right thing to do.
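(To illustrate the paired push/pop pattern being referred to, here is a small, self-contained sketch; the class and attribute names are made up. The current function name is pushed before its body is visited and popped immediately afterwards, so every recorded call is attributed to the correct enclosing caller.)

```python
import ast


class CallCollector(ast.NodeVisitor):
    """Collect caller -> callee names with a paired push/pop of the
    enclosing function name."""

    def __init__(self):
        self.stack = []  # names of enclosing functions
        self.calls = {}  # caller name -> list of callee names

    def visit_FunctionDef(self, node):
        self.stack.append(node.name)      # push ...
        self.calls.setdefault(node.name, [])
        self.generic_visit(node)          # ... visit the body ...
        self.stack.pop()                  # ... and pop (always paired)

    def visit_Call(self, node):
        if self.stack:  # ignore module-level calls
            func = node.func
            name = func.attr if isinstance(func, ast.Attribute) else getattr(func, "id", None)
            if name is not None:
                self.calls[self.stack[-1]].append(name)
        self.generic_visit(node)


# Example usage on an arbitrary source string:
# tree = ast.parse(source_code)
# collector = CallCollector()
# collector.visit(tree)
# print(collector.calls)
```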
Instead of calling it
@NimaSarajpoor How are things going here? I have two questions:
No particular concern from my side. The script you shared helped a lot!!
Noticed no missing call stacks. Will check again and provide an update if I notice otherwise.
Not really. There were only two changes:
Good point. I checked and there are currently 7 functions with
I checked the first three cases, and noticed:
(1) The y-value is a ratio, which is "the running time when
(2) In all cases:
(3) Regarding
I used the following code:
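(The benchmarking code itself is collapsed above. As a stand-in, the sketch below shows the general shape of such a comparison: time the same njit kernel under two fastmath settings and report the ratio. The kernel, array size, and the exact flag sets being compared are assumptions.)

```python
import time

import numpy as np
from numba import njit


def _kernel(a):
    # A simple reduction used purely for illustration
    total = 0.0
    for x in a:
        total += x * x
    return total


# Compile the same kernel with two different fastmath settings
kernel_fast = njit(fastmath=True)(_kernel)
kernel_strict = njit(fastmath=False)(_kernel)


def timeit(func, a, n_iter=100):
    func(a)  # trigger JIT compilation outside the timed loop
    start = time.perf_counter()
    for _ in range(n_iter):
        func(a)
    return (time.perf_counter() - start) / n_iter


a = np.random.rand(1_000_000)
ratio = timeit(kernel_strict, a) / timeit(kernel_fast, a)
print(f"running time (fastmath=False) / running time (fastmath=True): {ratio:.2f}")
```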
Please let me know if this is ready to be merged.
According to the results provided in my previous comment, I think it should be okay to change
Not really. However, wouldn't we need to remove a lot of unused
GitHub Actions raised this error:
A similar error was previously reported in #1061 for the same test function. Going to add the error above to that issue. I think it should be okay to re-run the GitHub Actions here. @seanlaw ?
[Update] |
Thank you @NimaSarajpoor for getting this over the finish line!
@seanlaw |
This PR is to fix #708. An initial inspection was done by @seanlaw in this comment. I am copying that list here for transparency and better tracking.
aamp._compute_diagonal - P, PL, PR contains np.inf and p can be np.inf
aamp._aamp - P, PL, PR contains np.inf
core._sliding_dot_product - Should be okay
core._calculate_squared_distance_profile - Should be okay (Returned value in D_squared might be np.inf but no arithmetic operation)
core.calculate_distance_profile - Should be okay (Returned value in D_squared might be np.inf)
core._p_norm_distance_profile - Should be okay (can p be np.inf? not supported. See: Add support for p=np.inf for non-normalized p-norm distance #1071)
core._mass - Should be okay
core._apply_exclusion_zone - val contains np.inf
core._count_diagonal_ndist - Should be okay
core._get_array_ranges - Should be okay
core._get_ranges - Should be okay
core._total_diagonal_ndists - Should be okay
fastmath._add_assoc - Should be okay
maamp._compute_multi_p_norm - p could possibly be np.inf (not supported. See: Add support for p=np.inf for non-normalized p-norm distance #1071). p_norm array can contain np.inf
mstump._compute_multi_D - Might be okay??
scraamp._compute_PI - P_NORM contains np.inf
scraamp._prescraamp - P_NORM contains np.inf
scrump._compute_PI - references np.inf values, so likely bad
scrump._prescrump - P_squared is np.inf
stump._compute_diagonal - ρ, ρL, and ρR contain np.inf
stump._stump - ρ, ρL, and ρR contain np.inf
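As a closing illustration of why this list matters (a sketch, not code from this PR): with fastmath=True, numba enables every LLVM fast-math flag, including "ninf" and "nnan", which allow the compiler to assume that np.inf and np.nan never occur. For functions above whose arrays legitimately contain np.inf, numba also documents passing only a subset of flags that do not make that assumption. The function below is made up purely for illustration.

```python
import numpy as np
from numba import njit


# Keep the reordering/contraction optimizations but drop "ninf"/"nnan",
# since D may legitimately contain np.inf (e.g., excluded/trivial matches)
@njit(fastmath={"nsz", "arcp", "contract", "afn", "reassoc"})
def masked_min(D):
    best = np.inf
    for d in D:
        if d < best:
            best = d
    return best


if __name__ == "__main__":
    D = np.array([np.inf, 3.0, np.inf, 1.5])
    print(masked_min(D))  # 1.5
```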