Post category - Domain Adaptation
Abstract: Cao J., Lin X., Cong X., Ya J., Liu T. and Wang B. DisenCDR: learning disentangled representations for cross-domain recommendation. In ACM SIGIR Conference on Research and Development in Information Retrieval.
Abstract: Feldman V. and Zhang C. What neural networks memorize and why: discovering the long tail via influence estimation. In Advances in Neural Information Processing Systems.
Abstract: Feldman V. Does learning require memorization? a short tale about a long tail. In Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing.
Abstract: Ganin Y. and Lempitsky V. Unsupervised Domain Adaptation by Backpropagation. ICML 2015. Overview: Supervised learning relies heavily on labeled data, but collecting labels at scale is expensive in practice, which is exactly why semi-supervised and unsupervised methods matter.
Abstract: Zhu B., Cui Q., Wei X. and Chen Z. BBN: Bilateral-Branch Network with Cumulative Learning for Long-Tailed Visual Recognition.
Abstract: Huang J., Smola A., Gretton A., Borgwardt K. & Scholkopf B. Correcting Sample Selection Bias by Unlabeled Data. NIPS, 2007. Overview: MMD quantifies how likely it is that two sets of samples were drawn from the same distribution, …
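Since the snippet above only describes MMD in words, here is a minimal NumPy sketch of the biased squared-MMD estimator with an RBF kernel. The function names, the bandwidth `sigma`, and the toy data are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Pairwise RBF kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    sq_dists = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq_dists / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    # Biased estimate of squared MMD between samples X and Y:
    # MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)].
    return (rbf_kernel(X, X, sigma).mean()
            + rbf_kernel(Y, Y, sigma).mean()
            - 2 * rbf_kernel(X, Y, sigma).mean())

rng = np.random.default_rng(0)
same = mmd2(rng.normal(size=(200, 5)), rng.normal(size=(200, 5)))        # near 0
diff = mmd2(rng.normal(size=(200, 5)), rng.normal(2.0, 1.0, (200, 5)))   # clearly > 0
print(same, diff)
```

An MMD near zero suggests the two samples share a distribution; a large value flags a mismatch, which is what makes it usable as a sample-selection-bias diagnostic.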
Abstract: Hinton G., Vinyals O. & Dean J. Distilling the Knowledge in a Neural Network. arXiv preprint arXiv:1503.02531. Overview: \[ q_i = \frac{\exp(z_i/T)}{\sum_j \exp(z_j/T)} \] …
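To make the temperature-scaled softmax above concrete, here is a small NumPy sketch; the function name `soften` and the example logits are my own, not from the paper.

```python
import numpy as np

def soften(z, T=1.0):
    # Temperature-scaled softmax: q_i = exp(z_i / T) / sum_j exp(z_j / T).
    z = np.asarray(z, dtype=float) / T
    z -= z.max()                 # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = np.array([5.0, 2.0, 0.5])
print(soften(logits, T=1))   # sharp, close to one-hot
print(soften(logits, T=4))   # softer, exposes relative similarities between classes
```

Raising T flattens the teacher's distribution so the student can learn from the relative probabilities of the wrong classes, not just the argmax.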
Abstract: Tian Y., Krishnan D., Isola P. Contrastive Representation Distillation. arXiv preprint arXiv:1910.10699, 2019. Overview: This feels about 50% similar to that other paper, but this one was written earlier, so the latter presumably drew on this pap…
Abstract: Neyshabur B., Sedghi H., Zhang C. What is being transferred in transfer learning? arXiv preprint arXiv:2008.11687, 2020. Overview: What exactly does transfer learning transfer? Main content: T: the ordinarily trained mod…
Abstract: Bai T., Chen J., Zhao J., Wen B., Jiang X., Kot A. Feature Distillation With Guided Adversarial Contrastive Learning. arXiv preprint arXiv:2009.09922, 2020.
Abstract: Pirmin Lemberger, Ivan Panico, A Primer on Domain Adaptation Theory and Applications, 2019. Overview: Machine learning splits into two steps, training and testing, and it is usually assumed that the training and test samples share the same distribution, but this does not necessarily hold in practi…