One for All: Universal Topological Primitive Transfer for Graph Structure Learning

Yide Qiu, Tong Zhang, Xing Cai, Hui Yan, Zhen Cui

Advances in Neural Information Processing Systems 38 (NeurIPS 2025) Main Conference Track

The non-Euclidean geometry inherent in graph structures fundamentally impedes cross-graph knowledge transfer. Drawing inspiration from texture transfer in computer vision, we pioneer topological primitives as transferable semantic units for graph structural knowledge. To address three critical barriers, namely the absence of specialized benchmarks, aligned semantic representations, and systematic transfer methodologies, we present G²SN-Transfer, a unified framework comprising: (i) TopoGraph-Mapping, which transforms non-Euclidean graphs into transferable sequences via topological primitive distribution dictionaries; (ii) G²SN, a dual-stream architecture that learns text-topology aligned representations through contrastive alignment; and (iii) AdaCross-Transfer, a data-adaptive knowledge transfer mechanism that leverages cross-attention in both full-parameter and parameter-frozen scenarios. In particular, G²SN is a dual-stream sequence network driven by ordinary differential equations, and our theoretical analysis establishes its convergence guarantee. We construct STA-18, the first large-scale benchmark with aligned topological primitive-text pairs across 18 diverse graph datasets. Comprehensive evaluations demonstrate that G²SN achieves state-of-the-art performance on four structure learning tasks (an average F1-score improvement of 3.2%), while our transfer method yields consistent gains across 13 downstream tasks (5.2% average improvement), including 10 large-scale graph datasets. The datasets and code are available at https://anonymous.4open.science/r/UGSKT-C10E/.
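
To make the dual-stream alignment idea concrete, the sketch below shows a minimal contrastive alignment between a topology-primitive stream and a text stream in PyTorch. It is an illustrative assumption, not the paper's G²SN implementation: the class name DualStreamAligner, the projection heads, the temperature value, and the symmetric InfoNCE-style loss are all placeholders chosen for clarity.

```python
# Minimal sketch of dual-stream text-topology contrastive alignment (assumed design,
# not the released G²SN code). Matched primitive-text pairs sit on the diagonal of
# the similarity matrix and are pulled together; mismatched pairs are pushed apart.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualStreamAligner(nn.Module):
    def __init__(self, topo_dim: int, text_dim: int, embed_dim: int = 256, temperature: float = 0.07):
        super().__init__()
        self.topo_proj = nn.Linear(topo_dim, embed_dim)  # topology-stream projection head
        self.text_proj = nn.Linear(text_dim, embed_dim)  # text-stream projection head
        self.temperature = temperature

    def forward(self, topo_feats: torch.Tensor, text_feats: torch.Tensor) -> torch.Tensor:
        # Project both streams into a shared space and L2-normalize.
        z_topo = F.normalize(self.topo_proj(topo_feats), dim=-1)
        z_text = F.normalize(self.text_proj(text_feats), dim=-1)
        # Pairwise similarity logits between all topology/text embeddings in the batch.
        logits = z_topo @ z_text.t() / self.temperature
        targets = torch.arange(logits.size(0), device=logits.device)
        # Symmetric cross-entropy over both alignment directions (topology->text, text->topology).
        return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))

# Usage with pre-pooled per-graph features (dimensions are illustrative).
aligner = DualStreamAligner(topo_dim=128, text_dim=768)
loss = aligner(torch.randn(32, 128), torch.randn(32, 768))
```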