Figure 3 | EPJ Data Science

From: Time-varying graph representation learning via higher-order skip-gram with negative sampling

Two-dimensional projections of the 128-dimensional embedding manifold spanned by the embedding matrices W (left of each panel) and T (right of each panel) of the HOSGNS model trained on LyonSchool data with: (a) \(\boldsymbol{\mathcal{P}}^{(\text{stat})}\) and (b) \(\boldsymbol{\mathcal{P}}^{(\text{dyn})}\). These plots show how the community structure and the evolution of time are captured by the individual node embeddings \(\{\mathbf{w}_{i}\}_{i \in \mathcal{V}}\) and the time embeddings \(\{\mathbf{t}_{k}\}_{k \in \mathcal{T}}\).
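
The caption does not state how the 128-dimensional embeddings are reduced to two dimensions. The sketch below is a minimal, hypothetical illustration assuming a t-SNE projection (via scikit-learn) applied separately to the node embedding matrix W and the time embedding matrix T; the array shapes are placeholders standing in for the trained HOSGNS matrices, not values taken from the paper.

```python
# Hypothetical sketch: 2D projections of HOSGNS embedding matrices, as visualized in Figure 3.
# Assumptions: t-SNE as the projection method and placeholder matrices `W` and `T_emb`
# in place of the trained node/time embeddings (shapes |V| x 128 and |T| x 128).

import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
W = rng.normal(size=(200, 128))      # placeholder node embeddings {w_i}, i in V
T_emb = rng.normal(size=(100, 128))  # placeholder time embeddings {t_k}, k in T

def project_2d(emb, seed=0):
    """Project an (n, 128) embedding matrix to 2D for visualization."""
    return TSNE(n_components=2, random_state=seed).fit_transform(emb)

W_2d = project_2d(W)        # left side of each panel
T_2d = project_2d(T_emb)    # right side of each panel
```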
