MobileNet transfer results

I. MobileNetV2

1. After the pooling layer:

Results using the pretrained network:

====> Recall@1: 0.5134
====> Recall@5: 0.7588
====> Recall@10: 0.8370
====> Recall@20: 0.8975
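For context, Recall@N here is the standard place-recognition metric: a query counts as a hit at N if any of its top-N retrieved database images is a ground-truth positive. A minimal sketch of that computation (the function name and data layout are mine, not the repo's):

    import numpy as np

    def recall_at_n(predictions, positives, n_values=(1, 5, 10, 20)):
        # predictions[q]: database indices ranked by descriptor distance
        # positives[q]:   ground-truth positive indices for query q
        hits = np.zeros(len(n_values))
        for q, preds in enumerate(predictions):
            for i, n in enumerate(n_values):
                if np.any(np.in1d(preds[:n], positives[q])):
                    hits[i:] += 1  # a hit in the top n also counts for larger n
                    break
        return hits / len(predictions)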


2. Deleting the 1280-channel layer:

Delete the final conv layer, so that encoder_dim becomes 320.
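A minimal sketch of this truncation, assuming the backbone is torchvision's mobilenet_v2 (variable names are illustrative):

    import torch.nn as nn
    from torchvision import models

    mobilenet = models.mobilenet_v2(pretrained=True)
    # features[-1] is the final 1x1 ConvBNReLU expanding 320 -> 1280 channels;
    # dropping it exposes the 320-channel output of the last inverted-residual
    # block, which then feeds the NetVLAD pooling layer.
    encoder = nn.Sequential(*list(mobilenet.features.children())[:-1])
    encoder_dim = 320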


====> Recall@1: 0.5475
====> Recall@5: 0.7721
====> Recall@10: 0.8492
====> Recall@20: 0.9059

Recall is even slightly higher than with the full 1280-dim features.

Recorded elapsed times:

6.7551679611206055
8.487680435180664
8.224127769470215
7.55020809173584
7.8230719566345215
7.158783912658691
8.481663703918457
7.0414719581604
8.832768440246582
7.55785608291626
6.857600212097168
6.9655680656433105
6.981215953826904
6.499904155731201
7.707104206085205
6.428639888763428
7.851103782653809
6.991392135620117
6.8152642250061035
8.19983959197998
7.4161601066589355

About twice as fast as the un-truncated network.
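These numbers look like per-pass wall-clock seconds; a sketch of how such timings could be taken (a hypothetical helper, not the script's actual code):

    import time
    import torch

    @torch.no_grad()
    def timed_forward(model, batch):
        torch.cuda.synchronize()    # flush pending GPU work before timing
        start = time.time()
        _ = model(batch)
        torch.cuda.synchronize()    # wait for this forward pass to finish
        return time.time() - start  # elapsed wall-clock time in seconds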


With this truncated setup, an experiment using Adam (lr = 0.00001) and the MobileNet backbone frozen:
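A sketch of that configuration, assuming `model.encoder` holds the MobileNet features (names are illustrative):

    import torch.optim as optim

    # Freeze the MobileNet backbone so only the NetVLAD layer trains.
    for p in model.encoder.parameters():
        p.requires_grad = False

    optimizer = optim.Adam(
        (p for p in model.parameters() if p.requires_grad),
        lr=1e-5,  # the lr = 0.00001 used above
    )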

====> Recall@1: 0.2705
====> Recall@5: 0.4816
====> Recall@10: 0.5903
====> Recall@20: 0.7033


Still low. Switching to SGD with a learning rate of 0.000001:
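Same setup with the optimizer swapped out (the momentum value is an assumption, not stated in these notes; `model` is as in the previous sketch):

    import torch.optim as optim

    optimizer = optim.SGD(
        (p for p in model.parameters() if p.requires_grad),
        lr=1e-6,       # the 0.000001 learning rate above
        momentum=0.9,  # assumed; a common default choice
    )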

Trained for 5 epochs; the results fluctuate. See ~/packages/slam_loop/mobilenetv3-master/pytorch-NetVlad-master/runs/SGD_lr0P000001 for details.

The same folder also contains the 5-epoch results with Adam.
