
IJCAI2019.Node Embedding over Temporal Graphs #73

Open
soroush-ziaeinejad opened this issue Jan 27, 2023 · 0 comments
Why did I choose this paper?
This paper is directly related to the graph embedding layer of SEERa. The authors propose an algorithm that learns how the nodes and edges of a graph change over time, and they claim it can be used for prediction tasks. The code for the framework is also available.

Main problem:

Using temporal information from a graph to generate more comprehensive node embedding representations that preserve both network structure and dynamics.

Applications:

Social Network Analysis, Recommendation Systems, and Fraud Detection.

Existing work:

  • Deep learning for representation learning in temporal prediction problems where the input is a graph, minimizing a representation (reconstruction) loss
  • Deep learning for temporal prediction problems where the input is a graph, directly minimizing the loss function of a downstream prediction task
  • Feature learning

Gaps:

  • The prediction task must be specified in advance
  • High training time
  • Failure to scale well across multiple timestamps
  • Discover characteristics only of individual snapshots of a graph, not its dynamics

Inputs:

A temporal graph, which is a graph where the edges and/or nodes have timestamps associated with them.

Outputs:

A vector representation (embedding) of each node in the graph.
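
To make this input/output contract concrete, here is a minimal Python sketch (my own illustration, not the paper's code; all names and values are assumptions): a temporal graph represented as timestamped edges, snapshots derived from it, and one vector per node as the output.

```python
import numpy as np

# Input: a temporal graph as (source, target, timestamp) triples.
temporal_edges = [
    ("a", "b", 1), ("b", "c", 1),
    ("a", "c", 2), ("c", "d", 3),
]

def snapshot(edges, t):
    """Edges present in the graph snapshot at time step t."""
    return [(u, v) for u, v, ts in edges if ts <= t]

nodes = sorted({n for u, v, _ in temporal_edges for n in (u, v)})

# Output: one d-dimensional embedding vector per node
# (random here, standing in for the learned embeddings).
d = 8
rng = np.random.default_rng(0)
embeddings = {n: rng.normal(size=d) for n in nodes}
```

A snapshot at time t simply keeps every edge whose timestamp is at most t, so later snapshots grow monotonically.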

Method:

This paper extends current feature-learning methods for graph data by taking the temporal aspects of the graph into account. Other studies have attempted to improve static node embeddings by considering historical information; this work instead uses both node and edge dynamics to create more informative embeddings for temporal prediction tasks. The approach is unique in offering an end-to-end system that can be optimized for a specific task, and its evaluation demonstrates its effectiveness on various datasets for both temporal link prediction and node classification when compared to other methods. The feature-learning framework can be summarized as follows:
goal: find a feature vector for each node at time t.
tasks: two major prediction tasks are considered, node classification (categorical cross-entropy loss) and link prediction (binary classification loss).
optimizer: Adam
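
The two task losses are standard; as a small NumPy sketch (my own minimal definitions for illustration, not the paper's code):

```python
import numpy as np

def categorical_cross_entropy(probs, label):
    """Node classification loss: negative log-probability of the true class."""
    return -np.log(probs[label])

def binary_cross_entropy(p, y):
    """Link prediction loss: p is the predicted edge probability, y is 1 if
    the edge exists and 0 otherwise."""
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))
```

Both losses shrink toward zero as the model assigns more probability to the correct class or edge outcome.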

Steps:

  • Initialization: The embeddings of nodes in the temporal graph are initialized using pre-trained embeddings of the same nodes from a static version of the graph. This can help the model converge faster and improve the overall performance of the temporal embedding.
  • Temporal Embedding: Temporal information is incorporated during training: the per-snapshot embeddings are aligned across consecutive time steps and fed to a recurrent network (LSTM) that captures the temporal dependencies between the nodes and edges over time.
  • Joint Optimization: An end-to-end architecture is provided that can be jointly optimized to a given task. This allows the embeddings to be fine-tuned by incorporating task-specific information during the training process.
  • Evaluation: The approach is evaluated on various datasets for both temporal link prediction and node classification tasks, and compared to other methods. The results show that the proposed approach outperforms the relevant baselines, demonstrating its effectiveness in learning node embeddings in temporal graphs.
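
The initialization step hinges on the fact that embeddings trained independently on different snapshots live in arbitrarily rotated coordinate systems, so consecutive snapshots must be aligned before a recurrent model can compare them. A minimal sketch of such an alignment via Orthogonal Procrustes (my own illustration, not the authors' code):

```python
import numpy as np

def align(prev_emb, curr_emb):
    """Rotate curr_emb onto prev_emb's coordinate system.

    Orthogonal Procrustes: the optimal rotation is R = U @ Vt, where
    U, _, Vt = svd(curr_emb.T @ prev_emb).
    """
    u, _, vt = np.linalg.svd(curr_emb.T @ prev_emb)
    return curr_emb @ (u @ vt)

# Demo: a rotated copy of an embedding matrix is mapped back exactly.
rng = np.random.default_rng(1)
Q = rng.normal(size=(5, 3))                   # 5 nodes, dimension 3
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
assert np.allclose(align(Q, Q @ R), Q)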

Experimental Setup:

Datasets:
Links are available in the paper.

  1. arXiv hep-ph: research publication graph
  2. Facebook friendships: graph of the Facebook social network. Who is a friend of whom?
  3. Facebook wall posts: graph of the Facebook social network. Who posts on another user's wall?
  4. CollegeMsg: online social network
  5. PPI: protein-protein interactions
  6. Slashdot: social news website
  7. Cora: research publication graph
  8. DBLP: bibliographic network of computer science publications

Metrics:

  • AUC
  • F1-score
  • NMI
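
For reference, AUC can be computed directly from its rank interpretation: the probability that a randomly chosen positive example is scored above a randomly chosen negative one. A small NumPy sketch (illustrative, not from the paper):

```python
import numpy as np

def auc(scores, labels):
    """AUC as the normalized Mann-Whitney U statistic."""
    scores, labels = np.asarray(scores, float), np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```

F1-score and NMI are likewise standard classification/clustering metrics and are available in common libraries such as scikit-learn.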

Baselines:

  1. Node2vec
  2. Temporal Matrix Factorization (TMF) - 2011
  3. Temporally Factorized Network Modeling (TFNM) - 2017
  4. Continuous-Time Dynamic Network Embeddings (CTDNE) - 2018
  5. Hawkes process-based Temporal Network Embedding (HTNE) - 2018
  6. DynamicTriad (DynTri) - 2018

The node embedding at the last time step is taken as the final embedding (similar to SEERa).

Results:

The authors use several tasks such as link prediction, classification, and clustering to evaluate the quality of the embeddings generated by the method.

temporal link prediction: tNodeEmbed outperforms all baselines on every dataset, although on some datasets its margin over the strongest baseline is only about 0.001. Metric: AUC. Datasets: all.
multi-label node classification: tNodeEmbed outperforms the baselines by a clear margin. Datasets: Cora and DBLP. Metrics: Micro-F1, Macro-F1, AUC

Code:

The code of this paper is available at: Code

Medium:

The Medium post about this paper is available at: Medium

@soroush-ziaeinejad soroush-ziaeinejad added the literature-review Summary of the paper related to the work label Jan 27, 2023
@soroush-ziaeinejad soroush-ziaeinejad self-assigned this Jan 27, 2023