First Step accuracy? #1
lr_steps may be set to 80000. Acc: 95.6; data just uploaded. It will still take three days to verify the project.
Thanks @moli232777144, I'll try it.
Unfortunately, I got bad results. How did your experiment go?
samples/sec acc=0.366797
[lfw][530000]Accuracy-Flip: 0.99000+-0.00459 |
I feel my result is even worse. I used this command:
The training parameters for the first step are problematic; the accuracy I get when training on my own machine does not meet the requirement.
@moli232777144 Me too! Now I'm training again with just this command:
Uploaded! Weight decay should be set to 0.00004.
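For reference, here is a minimal sketch of how that weight-decay value enters a plain SGD update, assuming the standard coupled (L2) formulation that MXNet's SGD optimizer uses. The wd value 0.00004 comes from the comment above; the function name and the other numbers are purely illustrative, not the repo's actual code.

```python
def sgd_step(weight, grad, lr=0.1, wd=0.00004):
    """One SGD step with coupled L2 weight decay: w <- w - lr * (grad + wd * w).

    wd=0.00004 is the value suggested in this thread; lr and the
    inputs below are illustrative only.
    """
    return weight - lr * (grad + wd * weight)

w = 1.0
w = sgd_step(w, grad=0.5)
print(w)  # weight after one step; the wd term shrinks it slightly more than plain SGD would
```

With such a small wd, the decay term barely moves a single step, but over tens of thousands of iterations it regularizes the weights noticeably, which is why halving or doubling it changes the final LFW/AgeDB numbers.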
Can you share your first-step softmax result models with me?
Any progress?
[2018-05-08 15:30:36] INFO:root:Epoch[9] Batch [1060] Speed: 588.76 samples/sec acc=0.279980 |
I find I can get 99.37% on LFW at 40000 steps, but if I train for 70000 steps I only get 99.1% on LFW. Maybe we should set lr=0.01 at 40000 steps.
You can try it. I still need a day to run this experiment.
@moli232777144 I'll try it. If I get good results, I will report the log.
Not a good result: I got 99.45 on LFW and 94.50 on AgeDB, so it can't go higher.
Updated. Maybe we should increase the number of iterations until acc is stable.
In the second step, I got 99.2 on LFW and 95.1 on AgeDB; maybe I need to continue training.
In the first stage, I get results like this: lr-batch-epoch: 1e-05 5723 18
@moli232777144 Do you have good results?
With lr 0.1 for 40000 steps, then lr 0.01 for another 20000 steps, I get 95.59 on AgeDB and 99.51 on LFW. I will continue to extend the steps.
Thanks, I'll try it next time.
I trained again and again, and now I get a better result:
Good job! Did you modify any parameters? What was your fine-tune process?
Did you train on a single card?
@moli232777144 Fine-tune, but I first train step 3 (s=128), and then I train step 2 (s=64).
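For readers unfamiliar with the "s=128 / s=64" above: s is the feature-scale hyperparameter of an ArcFace-style margin loss. Here is a hedged, stdlib-only sketch of how s scales the target-class logit; the function, margin value m, and the inputs are illustrative, not the repo's exact settings.

```python
import math

def arcface_logit(cos_theta, s=64.0, m=0.5):
    """Scaled target-class logit of an ArcFace-style loss: s * cos(theta + m).

    s is the feature scale being swapped between the training stages
    discussed above (s=128, then s=64); m is the additive angular margin.
    """
    theta = math.acos(max(-1.0, min(1.0, cos_theta)))
    return s * math.cos(theta + m)

# The logit is linear in s, so s=128 produces logits twice as large as
# s=64 for the same angle, sharpening the softmax distribution.
print(arcface_logit(0.8, s=128.0))
print(arcface_logit(0.8, s=64.0))
```

The commenter trains the s=128 stage before the s=64 stage, i.e. the two stages differ only in this scale, not in the network.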
How much GPU memory are you all working with? Even a batch of 128 doesn't fit for me.
@xxllp 8 GB of GPU memory should support a batch size of 180. Also, putting the data on an SSD can significantly speed up training; on a single 1070 you can reach 450 samples/s.
Hello, may I ask what accuracy the acc printed during training refers to? @moli232777144
It is the classification accuracy on the training data itself. @zhangxiaopang88
Oh, thank you. I trained on the asia-celebrity dataset and the accuracy stays around 0.44. Do you have any training tips you could share? @moli232777144
I don't have a P40 GPU card, but I have two 1080 GPUs; you can set batch-size 512, but I can only set 128.
In your README, you train 40 thousand iterations in your first step; how many iterations should I train in my first step? And in the first step, what accuracy do you get on LFW or AgeDB-30?