A Gentle Introduction to Deep Learning for Graphs
tutorial introduction to Deep Learning for Graphs
what is deep learning for graphs?
abstract
The paper takes a top-down view of the problem, introducing a generalized formulation of graph representation learning based on a local and iterative approach to structured information processing.
- what does the adjective "top-down" mean here?
- what is a "generalized formulation"?
- what does the whole sentence mean?
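My reading (not the paper's wording): "top-down" means the survey starts from one general framework and then specializes it into the individual models, instead of listing models one by one; the "generalized formulation" is the shared local and iterative update in which every node repeatedly recomputes its representation from its neighbours' representations. In symbols (notation mine, a sketch of that shared update):

$$ h_v^{(\ell+1)} = \phi\Big( h_v^{(\ell)},\ \Psi\big(\{\, h_u^{(\ell)} : u \in \mathcal{N}(v) \,\}\big) \Big) $$

where N(v) are the neighbours of node v, Ψ is a permutation-invariant aggregation over the neighbourhood, and φ is a learned update; iterating (or stacking) this step lets information spread across the whole graph, which is what "local and iterative processing of structured information" refers to.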
It introduces the basic building blocks that can be combined to design novel and effective neural models for graphs.
- in what way can they be combined to design neural models?
- what are neural models for graphs?
The methodological exposition is complemented by a discussion of interesting research challenges and applications in the field.
- what is the methodological exposition?
- how is it complemented by the discussion?
introduction
background
- graphs are a powerful way to represent structured data
- this requires deep learning models that can process graphs in an adaptive fashion.
- history: a long-standing and consolidated history, rooted in the early nineties with seminal works on Recursive Neural Networks (RecNN) for tree-structured data.....
- ???
This paper takes pace from this historical perspective to provide a gentle introduction to the field of neural networks for graphs, also referred to as deep learning for graphs in modern terminology.
purpose of this paper
.....
guidance
Section 2
we first provide a generalized formulation of the problem of representation learning in graphs, introducing and motivating the architecture roadmap that we will be following throughout the rest of the paper.
We will focus, in particular, on approaches that deal with local and iterative processing of information.
Section 3
introduce the basic building blocks that can be assembled and combined to create modern deep learning architectures for graphs.
In this context, we will introduce the concepts of graph convolutions as local neighborhood aggregation functions, the use of attention, sampling and pooling operators defined over graphs, and we will conclude with a discussion on aggregation functions that compute whole-structure embeddings.
???
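To make these building blocks concrete, here is a minimal NumPy sketch (mine, not code from the paper): a graph convolution implemented as mean aggregation over each node's neighbourhood, iterated for two layers, followed by a simple mean readout that produces a whole-structure embedding. All names and the toy graph are made up for illustration.

```python
import numpy as np

def graph_conv(H, adj, W_self, W_neigh):
    """One local aggregation step: each node combines its own state with
    the mean of its neighbours' states, then applies a nonlinearity."""
    deg = adj.sum(axis=1, keepdims=True)        # node degrees
    neigh_mean = adj @ H / np.maximum(deg, 1)   # mean over neighbours
    return np.tanh(H @ W_self + neigh_mean @ W_neigh)

def graph_readout(H):
    """Aggregate all node embeddings into a single whole-structure embedding."""
    return H.mean(axis=0)

# Toy graph: 4 nodes on a path, 3-dimensional node features.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
H = rng.normal(size=(4, 3))
W_self, W_neigh = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))

for _ in range(2):              # iterate the local update (two "layers")
    H = graph_conv(H, adj, W_self, W_neigh)

print(graph_readout(H))         # graph-level embedding
```

Attention, sampling and pooling are refinements of the same idea: attention weighs the neighbours instead of averaging them uniformly, sampling restricts the neighbourhood that is aggregated, and pooling coarsens the graph between layers.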
Section 4
main learning tasks undertaken in graph representation learning
together with the associated cost functions and a characterization of the related inductive biases.
???
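A hedged example of what "associated cost functions" can look like in practice (my sketch, not taken from the paper): for node classification, a standard cost is the cross-entropy between predicted class probabilities and the true node labels, computed on top of the node embeddings.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def node_classification_loss(H, W_out, labels):
    """Cross-entropy over node-level predictions.
    H: node embeddings (n x d), W_out: output weights (d x num_classes),
    labels: integer class per node, shape (n,)."""
    probs = softmax(H @ W_out)
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

# Toy usage with random embeddings and binary node labels.
rng = np.random.default_rng(1)
H = rng.normal(size=(4, 3))
W_out = rng.normal(size=(3, 2))
labels = np.array([0, 1, 1, 0])
print(node_classification_loss(H, W_out, labels))
```

Graph classification uses the same loss on a whole-structure embedding, and link prediction typically scores node pairs with a binary cross-entropy.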
The final part of the paper surveys other related approaches and tasks (Section 5), and it discusses interesting research challenges (Section 6) and applications (Section 7). We conclude the paper with some final considerations and hints for future research directions.
concepts and keywords
Convolutional Neural Networks (CNN)
Deep Graph Network (DGN): the umbrella term the paper uses for deep learning models that process graphs
DNGN: Deep Neural Graph Network
DBGN: Deep Bayesian Graph Network
DGGN: Deep Generative Graph Network