[Hackathon 7th No.39] Add API conversion rules for the Paddle code conversion tool (Group 6) #477
Conversation
Thanks for your contribution!
@Asthestarsfalll The unit tests failed. Please make sure CI passes; the PR will not be merged until CI passes.

@Asthestarsfalll For the first issue, write a disabled version of the unit test and note the reason in a comment.
paconvert/api_matcher.py
Outdated
class Softmin(paddle.nn.Softmax):
    def forward(self, x):
        return super().forward(-x)
setattr(paddle.nn, 'Softmin', Softmin)
There seems to be no way to access the input of forward, so this was the only approach I could find.
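The workaround above relies on the identity Softmin(x) = Softmax(-x). A minimal numpy sketch (a stand-in for the paddle code, with illustrative function names) verifying that identity:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def softmin(x, axis=-1):
    # Softmin is, by definition, Softmax applied to the negated input
    return softmax(-x, axis=axis)

x = np.array([[1.0, 2.0, 3.0]])
print(softmin(x))  # the smallest entry gets the largest probability
```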
@Asthestarsfalll CI is currently failing and needs to be fixed.

@Asthestarsfalll CI is failing; please investigate.

@Asthestarsfalll CI is still failing; please investigate.

CI is failing.
from apibase import APIBase

obj = APIBase("torch.nn.Softmin", is_aux_api=True)
@zhwesky2010 The CI failure is probably caused by the aux code, but it still fails even with is_aux_api=True.
@zhwesky2010 The CI failure is probably caused by the aux code, but it still fails even with is_aux_api=True.
Does it fail when you test locally?
- If it fails locally as well, you need to debug it line by line yourself.
- If it passes locally and only fails on CI, we will help investigate or fix the CI environment.
You still need to find out what is wrong in Softmin; debug it line by line.
paconvert/api_matcher.py
Outdated
@@ -3914,6 +3914,36 @@ def generate_code(self, kwargs):
        return GenericMatcher.generate_code(self, kwargs)
class SoftminMatcher(SoftmaxMatcher):
    def generate_code(self, kwargs):
        self.paddle_api = "paddle.nn.Softmin"
Can this not be configured in the JSON mapping instead?
Since forward needs to be overridden, it probably cannot be expressed through the JSON configuration.
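The subclassing pattern under discussion can be mimicked in plain Python. The class names below follow paconvert's, but the bodies are illustrative stand-ins, not the real implementation:

```python
class SoftmaxMatcher:
    """Illustrative stand-in for paconvert's matcher behaviour (assumption)."""

    def __init__(self):
        self.paddle_api = "paddle.nn.Softmax"

    def generate_code(self, kwargs):
        args = ", ".join(f"{k}={v}" for k, v in kwargs.items())
        return f"{self.paddle_api}({args})"

class SoftminMatcher(SoftmaxMatcher):
    def generate_code(self, kwargs):
        # only the target API changes; code generation is inherited, which
        # is behaviour a static JSON mapping alone cannot express
        self.paddle_api = "paddle_aux.Softmin"
        return super().generate_code(kwargs)

print(SoftminMatcher().generate_code({"axis": 1}))  # paddle_aux.Softmin(axis=1)
```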
paconvert/api_matcher.py
Outdated
class Softmin(paddle.nn.Softmax):
    def forward(self, x):
        return super().forward(-x)
setattr(paddle.nn, 'Softmin', Softmin)
Then there is no need to setattr onto paddle; call paddle_aux.Softmin directly, to avoid giving the impression that paddle itself provides paddle.nn.Softmin.
Locally the same error occurs on the first run, but the second run passes. Other tests that use aux code, such as Softmax, show the same behavior.
Locally the same error occurs on the first run, but the second run passes. Other tests that use aux code, such as Softmax, show the same behavior.
@Asthestarsfalll This CI issue needs to be fixed.
Apart from torch.nn.Softmin, could the rest be submitted as a separate PR and merged first?
- Please merge the latest master; this branch is many commits behind.
- The unit tests must be comprehensive. All four of the following cases must be covered:
1. Pass all arguments, every one by keyword
2. Pass all arguments, every one positionally
3. Leave all defaulted arguments unspecified
4. Change the order of the keyword arguments
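The four required call patterns can be sketched against a plain Python function; the function name and its parameters are illustrative, not a real torch API:

```python
def lp_pool(x, norm_type, kernel_size, stride=1):
    # illustrative signature with one defaulted parameter
    return (tuple(x), norm_type, kernel_size, stride)

# 1. all arguments, all by keyword
a = lp_pool(x=[1.0, 2.0], norm_type=2, kernel_size=3, stride=2)
# 2. all arguments, all positional
b = lp_pool([1.0, 2.0], 2, 3, 2)
# 3. defaulted arguments left unspecified
c = lp_pool([1.0, 2.0], 2, 3)
# 4. keyword arguments in a different order
d = lp_pool(stride=2, kernel_size=3, x=[1.0, 2.0], norm_type=2)

print(a == b == d)  # True: cases 1, 2 and 4 resolve to the same call
```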
paconvert/api_matcher.py
Outdated
if self._axis is None:
    return paddle.nn.functional.softmax(x, _get_softmax_dim(x.ndim))
return paddle.nn.functional.softmax(x, self._axis)
setattr(paddle.nn.Softmax, 'forward', forward)
Then don't touch paddle.nn.Softmax at all; put this implementation directly under Softmin. Follow the principle of minimal auxiliary code: change nothing unless it is necessary.
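A self-contained shape of such a helper, sketched with numpy rather than paddle: the axis-inference rule mirrors what torch's internal `_get_softmax_dim` is commonly understood to do (an assumption here), and nothing on any Softmax class is patched:

```python
import numpy as np

def _infer_axis(ndim):
    # assumed rule, mirroring torch's legacy default:
    # axis 0 for 0-D, 1-D and 3-D inputs, axis 1 otherwise
    return 0 if ndim in (0, 1, 3) else 1

class Softmin:
    """Self-contained helper: no monkey-patching of Softmax."""

    def __init__(self, axis=None):
        self._axis = axis

    def __call__(self, x):
        axis = self._axis if self._axis is not None else _infer_axis(x.ndim)
        # exp(-(x - min)) keeps the largest exponent at 0 for stability
        e = np.exp(-(x - x.min(axis=axis, keepdims=True)))
        return e / e.sum(axis=axis, keepdims=True)

x = np.array([[1.0, 2.0], [3.0, 4.0]])
print(Softmin(axis=1)(x).sum(axis=1))  # each row sums to 1
```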
out, loss = asfm(input, target)
"""
)
obj.run(pytorch_code, ["out", "loss"], check_value=False)
Is there randomness here? Why is check_value disabled?
Both modules have randomly initialized weights.
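That is why check_value=False: with independently drawn initial weights, the two frameworks produce different numbers, so only shapes (not values) are comparable. A numpy sketch of the situation:

```python
import numpy as np

# the two frameworks each draw their own random initial weights
w_a = np.random.default_rng(0).normal(size=(4, 3))
w_b = np.random.default_rng(1).normal(size=(4, 3))

x = np.ones((2, 4))
out_a, out_b = x @ w_a, x @ w_b

# shapes agree, values do not -> run the test with check_value=False
print(out_a.shape == out_b.shape, np.allclose(out_a, out_b))  # True False
```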
paconvert/api_matcher.py
Outdated
if self._axis is None:
    return paddle.nn.functional.softmax(x, _get_softmax_dim(x.ndim))
return paddle.nn.functional.softmax(x, self._axis)
setattr(paddle.nn.Softmax, 'forward', forward)
What I mean is: do not modify paddle.nn.Softmax at all. What does this have to do with paddle.nn.Softmax? Why change another API?
Fixed.
PR Docs
PaddlePaddle/docs#6878
PR APIs
torch.nn.functional.lp_pool1d
torch.nn.functional.lp_pool2d
torch.nn.functional.threshold_
torch.nn.functional.feature_alpha_dropout
torch.nn.functional.scaled_dot_product_attention
torch.nn.LPPool1d
torch.nn.LPPool2d
torch.nn.Softmin
torch.nn.AdaptiveLogSoftmaxWithLoss
torch.nn.parameter.UninitializedParameter
torch.nn.parameter.UninitializedBuffer
torch.nn.CircularPad3d
torch.nn.utils.parametrizations.weight_norm
torch.optim.RAdam
torch.optim.NAdam
Current issues: