BingoCGN employs cross-partition message quantization to summarize inter-partition message flow, which eliminates the need for irregular off-chip memory access and utilizes a fine-grained structured ...
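To make the idea of summarizing cross-partition traffic concrete, here is a minimal sketch of quantizing boundary messages into a small codebook so the receiving partition reads a compact dense table instead of many irregular remote fetches. This is an illustration of the general technique only; the function names, k-means-style quantizer, and sizes are assumptions, not BingoCGN's actual implementation.

```python
# Sketch (not BingoCGN's implementation): summarize messages that cross a
# partition boundary into a few quantized centroids, so the receiving partition
# reads a small codebook plus indices instead of many irregular remote messages.
import numpy as np

def quantize_cross_partition_messages(messages: np.ndarray,
                                      num_centroids: int = 8,
                                      iters: int = 10):
    """k-means-style quantization of outgoing boundary messages.

    messages: (num_boundary_edges, feat_dim) raw messages leaving a partition.
    Returns (codebook, assignments): the receiver only needs the small codebook
    and a per-edge centroid index.
    """
    rng = np.random.default_rng(0)
    codebook = messages[rng.choice(len(messages), num_centroids, replace=False)]
    assign = np.zeros(len(messages), dtype=np.int64)
    for _ in range(iters):
        # Assign each message to its nearest centroid.
        dists = np.linalg.norm(messages[:, None, :] - codebook[None, :, :], axis=-1)
        assign = dists.argmin(axis=1)
        # Recompute centroids from their assigned messages.
        for c in range(num_centroids):
            members = messages[assign == c]
            if len(members) > 0:
                codebook[c] = members.mean(axis=0)
    return codebook, assign

if __name__ == "__main__":
    msgs = np.random.randn(1000, 64).astype(np.float32)
    codebook, assign = quantize_cross_partition_messages(msgs)
    approx = codebook[assign]            # dequantized messages on the receiver side
    print(codebook.shape, approx.shape)  # (8, 64) (1000, 64)
```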
Scholars deliver the first systematic survey of Dynamic GNNs, unifying continuous- and discrete-time models, benchmarking ...
Learn about the most prominent types of modern neural networks such as feedforward, recurrent, convolutional, and transformer networks, and their use cases in modern AI. Neural networks are the ...
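As a quick orientation to the four families named above, here is a minimal PyTorch sketch with arbitrary placeholder sizes; it is illustrative only and not tied to any of the articles listed here.

```python
# Illustrative only: tiny PyTorch versions of the four architecture families,
# with arbitrary placeholder sizes.
import torch
import torch.nn as nn

feedforward = nn.Sequential(          # fixed-size input -> fixed-size output
    nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

recurrent = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)  # sequences

convolutional = nn.Sequential(        # grid data (images), shared local filters
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10))

transformer = nn.TransformerEncoder(  # attention over token sequences
    nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True),
    num_layers=2)

x_vec = torch.randn(8, 32)            # batch of feature vectors
x_seq = torch.randn(8, 20, 32)        # batch of length-20 sequences
x_img = torch.randn(8, 3, 28, 28)     # batch of 28x28 RGB images

print(feedforward(x_vec).shape)       # torch.Size([8, 10])
print(recurrent(x_seq)[0].shape)      # torch.Size([8, 20, 64])
print(convolutional(x_img).shape)     # torch.Size([8, 10])
print(transformer(x_seq).shape)       # torch.Size([8, 20, 32])
```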
Highly sparse knowledge graphs are a key challenge for the Knowledge Graph Completion task. Due to the sparsity of such KGs, there are not enough first-order neighbors to learn the features of ...
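A brief sketch of the sparsity problem this snippet describes: when an entity has too few 1-hop neighbors, features are commonly gathered from k-hop neighborhoods instead. This is a generic BFS expansion for illustration, not the method of the cited work.

```python
# Generic illustration: expand from 1-hop to k-hop neighborhoods in a sparse KG.
from collections import defaultdict, deque

def k_hop_neighbors(triples, entity, k=2):
    """Return all entities reachable from `entity` within k hops (undirected)."""
    adj = defaultdict(set)
    for head, _rel, tail in triples:
        adj[head].add(tail)
        adj[tail].add(head)
    visited = {entity}
    frontier = deque([(entity, 0)])
    reachable = set()
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue
        for nxt in adj[node]:
            if nxt not in visited:
                visited.add(nxt)
                reachable.add(nxt)
                frontier.append((nxt, depth + 1))
    return reachable

triples = [("A", "r1", "B"), ("B", "r2", "C"), ("C", "r1", "D")]
print(k_hop_neighbors(triples, "A", k=1))  # {'B'}       -- sparse 1-hop view
print(k_hop_neighbors(triples, "A", k=2))  # {'B', 'C'}  -- richer 2-hop view
```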
Representing a molecule in a way that captures both its structure and function is central to tasks such as molecular property prediction, drug-drug ...
A new study led by researchers from the Yunnan Observatories of the Chinese Academy of Sciences has developed a neural network-based method for large-scale celestial object classification, according ...
Adapting to the Stream: An Instance-Attention GNN Method for Irregular Multivariate Time Series Data
DynIMTS replaces static graphs with instance-attention that updates edge weights on the fly, delivering SOTA imputation and P12 classification ...
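To illustrate the general idea of instance-attention edge weights (as opposed to one static adjacency shared across the dataset), here is a minimal PyTorch sketch. The class name, scoring function, and shapes are assumptions for illustration, not DynIMTS's actual architecture.

```python
# Illustrative sketch: per-instance attention scores used as graph edge weights,
# recomputed from each sample's own features instead of a fixed static adjacency.
import torch
import torch.nn as nn
import torch.nn.functional as F

class InstanceAttentionGraph(nn.Module):
    def __init__(self, feat_dim: int, hidden: int = 32):
        super().__init__()
        self.query = nn.Linear(feat_dim, hidden)
        self.key = nn.Linear(feat_dim, hidden)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        """x: (batch, num_vars, feat_dim) per-variable summaries of one instance.

        Returns a per-instance adjacency (batch, num_vars, num_vars) whose edge
        weights depend on the current sample's features.
        """
        q, k = self.query(x), self.key(x)                      # (B, V, H)
        scores = q @ k.transpose(-1, -2) / k.shape[-1] ** 0.5  # (B, V, V)
        return F.softmax(scores, dim=-1)                       # row-normalized edges

model = InstanceAttentionGraph(feat_dim=16)
x = torch.randn(4, 12, 16)   # 4 instances, 12 variables, 16-dim features
adj = model(x)               # (4, 12, 12): a different graph per instance
messages = adj @ x           # weighted neighbor aggregation per instance
print(adj.shape, messages.shape)
```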
A team led by Guoyin Yin at Wuhan University and the Shanghai Artificial Intelligence Laboratory recently proposed a modular machine learning ...