Post Category - Representation Learning

Abstract: [TOC] > [Zhou D., Bousquet O., Lal T. N., Weston J. and Schölkopf B. Learning with local and global consistency. NIPS, 2004.](https://proceedin… Read more
posted @ 2023-05-23 09:44 馒头and花卷 Views(159) Comments(0) Digg(0)
Abstract: [TOC] > [Huang Q., He H., Singh A., Lim S. and Benson A. R. Combining label propagation and simple models out-performs graph neural networks. ICLR, 20… Read more
posted @ 2023-05-22 15:58 馒头and花卷 Views(39) Comments(0) Digg(0)
Abstract: Yuan H. and Ji S. StructPool: structured graph pooling via conditional random fields. ICLR, 2020. Overview: a graph pooling method; I have not fully grasped its underlying principle and only record it here. Notation: G,… Read more
posted @ 2023-05-15 21:27 馒头and花卷 Views(40) Comments(0) Digg(0)
Abstract: Graikos A., Malkin N., Jojic N. and Samaras D. Diffusion models as plug-and-play priors. NIPS, 2022. Overview: given a prior distribution p(x) (fitted with an ordinary diffusion model), we often want to add some… Read more
posted @ 2023-05-10 20:47 馒头and花卷 Views(192) Comments(0) Digg(0)
Abstract: Tang J. and Wang K. Personalized top-n sequential recommendation via convolutional sequence embedding. WSDM, 2018. Overview: a classic work on sequential recommendation, applying convolution to the task. Notation: $\ma… Read more
posted @ 2023-05-09 21:19 馒头and花卷 Views(92) Comments(0) Digg(0)
Abstract: Ethayarajh K., Choi Y. and Swayamdipta S. Understanding dataset difficulty with V-usable information. ICML, 2022. Overview: applies V-inform… Read more
posted @ 2023-05-08 13:34 馒头and花卷 Views(191) Comments(0) Digg(0)
Abstract: Gupta U., Ferber A. M., Dilkina B. and Steeg G. V. Controllable guarantees for fair outcomes via contrastive information estimation. AAAI, 2021. Overview: this paper… Read more
posted @ 2023-05-06 14:36 馒头and花卷 Views(25) Comments(0) Digg(0)
Abstract: Ma Y., Wang S., Aggarwal C. C. and Tang J. Graph convolutional networks with eigenpooling. KDD, 2019. Overview: this paper proposes a new framework that, during the forward pass, gradually aggregates similar nodes and their features into… Read more
posted @ 2023-05-05 11:21 馒头and花卷 Views(102) Comments(0) Digg(0)
Abstract: Chiang W., Liu X., Si S., Li Y., Bengio S. and Hsieh C. Cluster-GCN: An efficient algorithm for training deep and large graph convolutional networks.… Read more
posted @ 2023-04-27 16:43 馒头and花卷 Views(50) Comments(0) Digg(0)
Abstract: Cen Y., Zou X., Zhang J., Yang H., Zhou J. and Tang J. Representation learning for attributed multiplex heterogeneous network. KDD, 2019. Overview: this paper works on Attrib… Read more
posted @ 2023-04-27 16:42 馒头and花卷 Views(77) Comments(0) Digg(0)
Abstract: Maron H., Ben-Hamu H., Shamir N. and Lipman Y. Invariant and equivariant graph networks. ICLR, 2019. Overview: sometimes we want a network to satisfy invariance: $$ f(PX) = f(X… Read more
posted @ 2023-04-24 16:22 馒头and花卷 Views(86) Comments(0) Digg(0)
Abstract: Contents: Overview | Notation | Motivation | LADIES | Code. Zou D., Hu Z., Wang Y., Jiang S., Sun Y. and Gu Q. Layer-dependent importance sampling for training deep and large graph con… Read more
posted @ 2023-04-20 19:49 馒头and花卷 Views(70) Comments(0) Digg(0)
Abstract: Wang X., Ji H., Shi C., Wang B., Cui P., Yu P. and Ye Y. Heterogeneous graph attention network. WWW, 2019. Overview: attention + heterogeneous graphs. Notation: $\mathcal{G} = (\ma… Read more
posted @ 2023-04-19 13:44 馒头and花卷 Views(53) Comments(0) Digg(0)
Abstract: Ren Y., Liu B., Huang C., Dai P., Bo L. and Zhang J. Heterogeneous deep graph infomax. arXiv preprint arXiv:1911.08538, 2019. Overview: this paper presents an unsupervised learning method for heterogeneous graphs. Here… Read more
posted @ 2023-04-19 11:55 馒头and花卷 Views(61) Comments(0) Digg(0)
Abstract: Contents: Overview | Notation | Motivation | FastGCN | Variance Analysis | Code. Chen J., Ma T. and Xiao C. FastGCN: fast learning with graph convolutional networks via importance sampling. ICLR, 2018… Read more
posted @ 2023-04-16 17:15 馒头and花卷 Views(116) Comments(0) Digg(0)
Abstract: Li Q., Han Z. and Wu X. Deeper insights into graph convolutional networks for semi-supervised learning. AAAI, 2018. Overview: this paper shows that GCN is in fact a form of smoothing, but… Read more
posted @ 2023-04-16 14:45 馒头and花卷 Views(41) Comments(0) Digg(0)
Abstract: Contents: Overview | Notation | Motivation | Method | Code. Chen J., Zhu J. and Song L. Stochastic training of graph convolutional networks with variance reduction. ICML, 2018. Overview: as we all know,… Read more
posted @ 2023-04-15 15:52 馒头and花卷 Views(89) Comments(0) Digg(0)
Abstract: Xu Y., Zhao S., Song J., Stewart R. and Ermon S. A theory of usable information under computational constraints. International Conference on Learning… Read more
posted @ 2023-04-03 16:00 馒头and花卷 Views(125) Comments(0) Digg(0)
Abstract: Gao Z., Guo J., Tan X., Zhu Y., Zhang F., Bian J. and Xu L. Difformer: Empowering diffusion models on the embedding space for text generation. arXiv p… Read more
posted @ 2023-03-27 17:33 馒头and花卷 Views(100) Comments(0) Digg(0)
