1.
Sci Rep ; 13(1): 16966, 2023 Oct 08.
Article in English | MEDLINE | ID: mdl-37807013

ABSTRACT

Graph neural networks (GNNs) have significant advantages in dealing with non-Euclidean data and have been widely used in various fields. However, most existing GNN models face two main challenges: (1) Most GNN models built on the message-passing framework have shallow structures, which hampers their ability to transmit information efficiently between distant nodes. We therefore propose a novel message-passing framework that enables GNN models with deep architectures akin to convolutional neural networks (CNNs), potentially comprising dozens or even hundreds of layers. (2) Existing models often treat the learning of edge features and node features as separate tasks. To overcome this limitation, we develop a deep graph convolutional neural network that learns edge embeddings and node embeddings simultaneously; the learned multi-dimensional edge feature matrix is used to construct multi-channel filters that capture node features more accurately. To address these challenges, we propose Co-embedding of Edges and Nodes with Deep Graph Convolutional Neural Networks (CEN-DGCNN). Our message-passing framework fully integrates and utilizes both node features and multi-dimensional edge features. Based on this framework, we build a deep graph convolutional neural network model that prevents over-smoothing and obtains non-local structural features and refined high-order node features by extracting long-distance dependencies between nodes and exploiting multi-dimensional edge features. We further propose a novel graph convolutional layer that learns node embeddings and multi-dimensional edge embeddings simultaneously: the edge embeddings are updated across layers based on node features and an attention mechanism, enabling efficient utilization and fusion of node and edge features. Additionally, we propose a multi-dimensional edge feature encoding method based on directed edges, and use the resulting edge feature matrix to construct a multi-channel filter that filters node information. Lastly, extensive experiments show that CEN-DGCNN outperforms a large number of graph neural network baselines, demonstrating the effectiveness of the proposed method.
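
To make the co-embedding idea concrete, the following is a minimal PyTorch sketch of one such layer: node embeddings and multi-dimensional edge embeddings are updated together, with edge features refreshed from endpoint node features through a simple attention-style gate. The dimensions, update rules, and helper names are assumptions for illustration; this is not the authors' CEN-DGCNN implementation.

# Illustrative sketch only: node/edge co-embedding with attention-gated messages.
import torch
import torch.nn as nn

class CoEmbeddingLayer(nn.Module):
    def __init__(self, node_dim: int, edge_dim: int):
        super().__init__()
        self.node_lin = nn.Linear(node_dim + edge_dim, node_dim)
        self.edge_lin = nn.Linear(2 * node_dim + edge_dim, edge_dim)
        self.att = nn.Linear(2 * node_dim, 1)  # scalar attention per directed edge

    def forward(self, x, edge_index, edge_attr):
        # x: [N, node_dim], edge_index: [2, E] (directed), edge_attr: [E, edge_dim]
        src, dst = edge_index
        # Attention weight for each edge, computed from its endpoint node features.
        alpha = torch.sigmoid(self.att(torch.cat([x[src], x[dst]], dim=-1)))
        # Message: source node feature combined with the multi-dimensional edge
        # feature (acting as a per-edge filter), weighted by attention.
        msg = alpha * self.node_lin(torch.cat([x[src], edge_attr], dim=-1))
        # Sum-aggregate messages at destination nodes.
        out = torch.zeros_like(x)
        out.index_add_(0, dst, msg)
        x_new = torch.relu(x + out)  # residual connection helps deep stacks
        # Update edge embeddings from the new endpoint node features.
        e_new = torch.relu(self.edge_lin(
            torch.cat([x_new[src], x_new[dst], edge_attr], dim=-1)))
        return x_new, e_new

# Tiny usage example on a 3-node directed graph; layers can be stacked many times.
x = torch.randn(3, 8)
edge_index = torch.tensor([[0, 1, 2], [1, 2, 0]])
edge_attr = torch.randn(3, 4)
layer = CoEmbeddingLayer(node_dim=8, edge_dim=4)
x, edge_attr = layer(x, edge_index, edge_attr)

The sketch keeps only the node/edge co-update and the edge-as-filter idea; in a deep setting many such layers would be stacked, with residual connections helping to counter over-smoothing.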

2.
PLoS One ; 18(3): e0279604, 2023.
Article in English | MEDLINE | ID: mdl-36897837

ABSTRACT

Graph Convolutional Networks (GCNs) are powerful deep learning methods for non-Euclidean structured data and achieve impressive performance in many fields. However, most state-of-the-art GCN models are shallow, with depths of no more than 3 to 4 layers, which greatly limits their ability to extract high-level node features. There are two main reasons for this: 1) Stacking too many graph convolution layers leads to over-smoothing. 2) Graph convolution is a localized filter and is therefore easily dominated by local properties. To solve these problems, we first propose a novel general framework for graph neural networks called Non-local Message Passing (NLMP). Under this framework, very deep graph convolutional networks can be designed flexibly, and the over-smoothing phenomenon is effectively suppressed. Second, we propose a new spatial graph convolution layer to extract multiscale high-level node features. Finally, we design an end-to-end Deep Graph Convolutional Neural Network II (DGCNNII) model for the graph classification task, which is up to 32 layers deep. The effectiveness of the proposed method is demonstrated by quantifying the graph smoothness of each layer and by ablation studies. Experiments on benchmark graph classification datasets show that DGCNNII outperforms a large number of shallow graph neural network baselines.
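
As a rough illustration of the ideas in this abstract, the sketch below stacks many graph convolution layers, each mixing a local neighbor-mean term with a simple graph-wide non-local term and a residual connection, followed by a mean readout for graph classification. The layer, aggregation, and readout choices are assumptions made for illustration; they do not reproduce the NLMP framework or DGCNNII.

# Illustrative sketch only: a deep stack of local + non-local layers with readout.
import torch
import torch.nn as nn

class NonLocalGCNLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.local = nn.Linear(dim, dim)
        self.non_local = nn.Linear(dim, dim)

    def forward(self, x, adj):
        # x: [N, dim], adj: [N, N] dense adjacency (small graphs only).
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        local = self.local(adj @ x / deg)                         # mean over neighbors
        global_ctx = self.non_local(x.mean(dim=0, keepdim=True))  # graph-wide context
        return torch.relu(x + local + global_ctx)                 # residual keeps deep stacks stable

class DeepGraphClassifier(nn.Module):
    def __init__(self, dim: int, num_classes: int, depth: int = 32):
        super().__init__()
        self.layers = nn.ModuleList([NonLocalGCNLayer(dim) for _ in range(depth)])
        self.readout = nn.Linear(dim, num_classes)

    def forward(self, x, adj):
        for layer in self.layers:
            x = layer(x, adj)
        return self.readout(x.mean(dim=0))  # mean readout over nodes

# Tiny usage example: a 4-node cycle graph with 8-dimensional node features.
x = torch.randn(4, 8)
adj = torch.tensor([[0., 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]])
logits = DeepGraphClassifier(dim=8, num_classes=2, depth=32)(x, adj)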


Subject(s)
Benchmarking; Neural Networks, Computer