
Fix bug with function name replacement #23

Merged
kit1980 merged 1 commit into main from sdym/name-replacement-bug on Feb 6, 2024
Conversation

@kit1980 (Contributor) commented Feb 5, 2024

Before this PR, TorchFix in standalone mode crashes on

    from torch import ger
    just_name = ger(pos_seq, inv_freq)

With this PR it works fine, replacing the code with

    from torch import outer, ger
    just_name = outer(pos_seq, inv_freq)

Note that the replacement code still imports the no-longer-needed ger, but that can be caught and fixed by existing linters like ruff.
Nevertheless, I'll create an issue to clean this up later in TorchFix.
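
For readers curious about the mechanics, below is a minimal, self-contained libcst sketch of the kind of rewrite described above. It is illustrative only, not TorchFix's actual implementation; the hard-coded ger/outer pair and the transformer name exist only for this example.

```python
import libcst as cst


class RenameGerToOuter(cst.CSTTransformer):
    """Illustrative sketch only, not TorchFix code: rewrite calls to a bare
    imported `ger` as `outer`, and add `outer` to `from torch import ...`."""

    def leave_Call(self, original_node: cst.Call, updated_node: cst.Call) -> cst.Call:
        func = updated_node.func
        # Only the bare-name form `ger(...)` (the case this PR fixes),
        # not the attribute form `torch.ger(...)`.
        if isinstance(func, cst.Name) and func.value == "ger":
            return updated_node.with_changes(func=cst.Name("outer"))
        return updated_node

    def leave_ImportFrom(
        self, original_node: cst.ImportFrom, updated_node: cst.ImportFrom
    ) -> cst.ImportFrom:
        if (
            isinstance(updated_node.module, cst.Name)
            and updated_node.module.value == "torch"
            and not isinstance(updated_node.names, cst.ImportStar)
            and any(
                isinstance(alias.name, cst.Name) and alias.name.value == "ger"
                for alias in updated_node.names
            )
        ):
            # Prepend `outer`; the now-unused `ger` import is left in place,
            # mirroring the behavior described in the PR.
            new_names = (cst.ImportAlias(name=cst.Name("outer")), *updated_node.names)
            return updated_node.with_changes(names=new_names)
        return updated_node


code = "from torch import ger\njust_name = ger(pos_seq, inv_freq)\n"
print(cst.parse_module(code).visit(RenameGerToOuter()).code)
# from torch import outer, ger
# just_name = outer(pos_seq, inv_freq)
```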

@facebook-github-bot added the CLA Signed label on Feb 5, 2024
@kit1980 marked this pull request as ready for review on February 5, 2024 23:48
@kit1980 requested review from huydhn and malfet on February 5, 2024 23:49
@kit1980 (Contributor, Author) commented Feb 5, 2024

Issue to clean up unneeded imports: #24

@izaitsevfb (Contributor) left a comment

LGTM

looks like it should work correctly with nested qualified names, but a test for that may be warranted.

@kit1980 (Contributor, Author) commented Feb 6, 2024

> LGTM
>
> looks like it should work correctly with nested qualified names, but a test for that may be warranted.

What do you mean exactly? Could you give an example?

@kit1980 merged commit 428b1e1 into main on Feb 6, 2024
2 checks passed
@kit1980 deleted the sdym/name-replacement-bug branch on February 6, 2024 20:37
@izaitsevfb (Contributor) commented

> LGTM
>
> looks like it should work correctly with nested qualified names, but a test for that may be warranted.
>
> What do you mean exactly? Could you give an example?

Something like this:

    import torch

    tp_mesh = torch.distributed.device_mesh.init_device_mesh("cuda", (8,))

@kit1980 (Contributor, Author) commented Feb 6, 2024

> Something like this:
>
>     import torch
>
>     tp_mesh = torch.distributed.device_mesh.init_device_mesh("cuda", (8,))

I don't understand. init_device_mesh is not a known deprecated function for TorchFix, so nothing will happen with the code.

Do you mean a case where there is more than one dot in the qualified name?
There are no deprecated functions with replacements like this currently, but there is tested functionality to replace, for example, functorch.vmap() with torch.func.vmap().
And that's a different code path from the bug this PR addresses.
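
For concreteness, here is an illustrative before/after of that functorch.vmap() to torch.func.vmap() path; the snippet is made up for this discussion and is not taken from the TorchFix test suite.

```python
import torch

a = torch.randn(8, 3)
b = torch.randn(8, 3)

# Deprecated spelling that TorchFix flags:
#   import functorch
#   batched_dot = functorch.vmap(torch.dot)(a, b)

# Replacement spelling that TorchFix rewrites it to:
batched_dot = torch.func.vmap(torch.dot)(a, b)
print(batched_dot.shape)  # torch.Size([8])
```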

@izaitsevfb (Contributor) commented

> I don't understand. init_device_mesh is not a known deprecated function for TorchFix, so nothing will happen with the code.
>
> Do you mean a case where there is more than one dot in the qualified name? There are no deprecated functions with replacements like this currently, but there is tested functionality to replace, for example, functorch.vmap() with torch.func.vmap(). And that's a different code path from the bug this PR addresses.

Sorry for not being clear. In my example I was indeed referring to the hypothetical case where init_device_mesh is deprecated and is used in the code with a nested qualified name like torch.distributed.device_mesh.init_device_mesh.

If that case is supported and tested, then awesome.
