
Table 7 Number of trainable parameters and training time of each time-varying graph representation learning model for LyonSchool and the two synthetic datasets. The embedding dimension is fixed to 128; technical specifications of the computing system and the hyper-parameter configuration are reported in Additional file 1

From: Time-varying graph representation learning via higher-order skip-gram with negative sampling

Dataset sizes: LyonSchool (\(|\mathcal{V}|=242\), \(|\mathcal{T}|=104\)); OpenABM-2k-100 (\(|\mathcal{V}|=2000\), \(|\mathcal{T}|=100\)); OpenABM-5k-20 (\(|\mathcal{V}|=5000\), \(|\mathcal{T}|=20\)).

| Model | LyonSchool: Tr. parameters | LyonSchool: Tr. time | OpenABM-2k-100: Tr. parameters | OpenABM-2k-100: Tr. time | OpenABM-5k-20: Tr. parameters | OpenABM-5k-20: Tr. time |
|---|---|---|---|---|---|---|
| DyANE | 4,396,544 | 62 s | 50,825,472 | 1014 s | 25,591,296 | 448 s |
| DynGEM | 459,270 | 516 s | 1,867,428 | 10,765 s | 4,270,428 | 23,307 s |
| DynamicTriad | 3,221,632 | 1131 s | 25,600,128 | 17,191 s | 12,800,128 | 12,625 s |
| DySAT | 98,336 | 18,323 s | 323,232 | 152,976 s | 707,232 | 8958 s |
| ISGNS | 61,952 | 381 s | 512,000 | 5895 s | 1,280,000 | 3062 s |
| \(\text{HOSGNS}^{(\text{stat})}\) | 75,264 | 316 s | 524,800 | 548 s | 1,282,560 | 724 s |
| \(\text{HOSGNS}^{(\text{dyn})}\) | 88,576 | 303 s | 537,600 | 565 s | 1,285,120 | 734 s |
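The SGNS-family parameter counts in the table can be reproduced arithmetically from the dataset sizes and the embedding dimension \(d = 128\). The sketch below assumes two node-embedding matrices (target and context) for ISGNS, plus one time-embedding matrix for \(\text{HOSGNS}^{(\text{stat})}\) and two for \(\text{HOSGNS}^{(\text{dyn})}\); this factorization is inferred from the numbers in the table, not stated in it.

```python
# Reproduce the SGNS-family parameter counts of Table 7 from |V| (nodes),
# |T| (snapshots), and embedding dimension d = 128. The matrix-count
# assumptions below are inferred, not taken from the table.

D = 128  # embedding dimension, fixed in the caption

def isgns_params(n_nodes: int) -> int:
    # Target and context node-embedding matrices: 2 * |V| * d.
    return 2 * n_nodes * D

def hosgns_stat_params(n_nodes: int, n_times: int) -> int:
    # ISGNS node matrices plus one time-embedding matrix: + |T| * d.
    return isgns_params(n_nodes) + n_times * D

def hosgns_dyn_params(n_nodes: int, n_times: int) -> int:
    # ISGNS node matrices plus two time-embedding matrices: + 2 * |T| * d.
    return isgns_params(n_nodes) + 2 * n_times * D

# LyonSchool: |V| = 242, |T| = 104
print(isgns_params(242))             # 61,952, matching the table
print(hosgns_stat_params(242, 104))  # 75,264, matching the table
print(hosgns_dyn_params(242, 104))   # 88,576, matching the table
```

The same formulas recover the OpenABM-2k-100 and OpenABM-5k-20 columns (e.g. \(2 \cdot 2000 \cdot 128 = 512{,}000\) for ISGNS), which supports the inferred factorization.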