PyGCL: A PyTorch Library for Graph Contrastive Learning


PyGCL is a PyTorch-based open-source Graph Contrastive Learning (GCL) library, which features modularized GCL components from published papers, standardized evaluation, and experiment management.



What is Graph Contrastive Learning?

Graph Contrastive Learning (GCL) establishes a new paradigm for learning graph representations without human annotations. A typical GCL algorithm first constructs multiple graph views via stochastic augmentation of the input and then learns representations by contrasting positive samples against negative ones.

👉 For a general introduction to GCL, please refer to our paper and blog. Also, this repo tracks newly published GCL papers.

Install

Prerequisites

PyGCL needs the following packages to be installed beforehand:

  • Python 3.8+
  • PyTorch 1.9+
  • PyTorch-Geometric 1.7
  • DGL 0.7+
  • Scikit-learn 0.24+
  • Numpy
  • tqdm
  • NetworkX

Installation via PyPI

To install PyGCL with pip, simply run:

pip install PyGCL

Then, you can import GCL from your current environment.

A note regarding DGL

Currently the DGL team maintains two versions: dgl for CPU support and dgl-cu*** for CUDA support. Since pip treats them as different packages, it is hard for PyGCL to check the version requirement of dgl. We have therefore removed the dependency check for dgl from our setup configuration and ask users to install a proper version themselves.

Package Overview

PyGCL implements four main components of graph contrastive learning algorithms:

  • Graph augmentation: transforms input graphs into congruent graph views.
  • Contrasting architectures and modes: generate positive and negative pairs according to node and graph embeddings.
  • Contrastive objectives: compute likelihood scores for positive and negative pairs.
  • Negative mining strategies: improve the negative sample set by considering the relative similarity (i.e., the hardness) of negative samples.

We also implement utilities for training models, evaluating model performance, and managing experiments.

Implementations and Examples

For a quick start, please check out the examples folder. The following methods are currently implemented:

  • DGI (P. Veličković et al., Deep Graph Infomax, ICLR, 2019) [Example1, Example2]
  • InfoGraph (F.-Y. Sun et al., InfoGraph: Unsupervised and Semi-supervised Graph-Level Representation Learning via Mutual Information Maximization, ICLR, 2020) [Example]
  • MVGRL (K. Hassani et al., Contrastive Multi-View Representation Learning on Graphs, ICML, 2020) [Example1, Example2]
  • GRACE (Y. Zhu et al., Deep Graph Contrastive Representation Learning, GRL+@ICML, 2020) [Example]
  • GraphCL (Y. You et al., Graph Contrastive Learning with Augmentations, NeurIPS, 2020) [Example]
  • SupCon (P. Khosla et al., Supervised Contrastive Learning, NeurIPS, 2020) [Example]
  • HardMixing (Y. Kalantidis et al., Hard Negative Mixing for Contrastive Learning, NeurIPS, 2020)
  • DCL (C.-Y. Chuang et al., Debiased Contrastive Learning, NeurIPS, 2020)
  • HCL (J. Robinson et al., Contrastive Learning with Hard Negative Samples, ICLR, 2021)
  • Ring (M. Wu et al., Conditional Negative Sampling for Contrastive Learning of Visual Representations, ICLR, 2021)
  • Exemplar (N. Zhao et al., What Makes Instance Discrimination Good for Transfer Learning?, ICLR, 2021)
  • BGRL (S. Thakoor et al., Bootstrapped Representation Learning on Graphs, arXiv, 2021) [Example1, Example2]
  • G-BT (P. Bielak et al., Graph Barlow Twins: A Self-Supervised Representation Learning Framework for Graphs, arXiv, 2021) [Example]
  • VICReg (A. Bardes et al., VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning, arXiv, 2021)

Building Your Own GCL Algorithms

Besides trying the above examples for node and graph classification tasks, you can also easily build your own graph contrastive learning algorithms.

Graph Augmentation

In GCL.augmentors, PyGCL provides the Augmentor base class, which offers a universal interface for graph augmentation functions. Specifically, PyGCL implements the following augmentation functions:

| Augmentation | Class name |
| --- | --- |
| Edge Adding (EA) | EdgeAdding |
| Edge Removing (ER) | EdgeRemoving |
| Feature Masking (FM) | FeatureMasking |
| Feature Dropout (FD) | FeatureDropout |
| Edge Attribute Masking (EAR) | EdgeAttrMasking |
| Personalized PageRank (PPR) | PPRDiffusion |
| Markov Diffusion Kernel (MDK) | MarkovDiffusion |
| Node Dropping (ND) | NodeDropping |
| Node Shuffling (NS) | NodeShuffling |
| Subgraphs induced by Random Walks (RWS) | RWSampling |
| Ego-net Sampling (ES) | Identity |

Calling these augmentation functions on a graph, given as a tuple of node features, edge index, and edge attributes (x, edge_index, edge_attrs), produces the corresponding augmented graph.
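To make the idea concrete, here is a minimal pure-Python sketch of what an edge-removing augmentation does conceptually. This is an illustration only, not PyGCL's EdgeRemoving implementation (which operates on tensors); the function name and list-of-pairs edge representation are our own:

```python
import random

def remove_edges(edge_index, pe, seed=0):
    """Illustrative stand-in for an EdgeRemoving-style augmentation:
    drop each edge independently with probability pe."""
    rng = random.Random(seed)
    return [(u, v) for (u, v) in edge_index if rng.random() >= pe]

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
view = remove_edges(edges, pe=0.3)
# The augmented view is a (random) subset of the original edges.
assert set(view) <= set(edges)
```

Stochastic augmentations like this produce a different view each time they are sampled, which is what makes the two branches of a contrastive model see congruent but distinct graphs.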

Composite Augmentations

PyGCL supports composing an arbitrary number of augmentations. To compose a list of augmentation instances, use the Compose class:

import GCL.augmentors as A

aug = A.Compose([A.EdgeRemoving(pe=0.3), A.FeatureMasking(pf=0.3)])

You can also use the RandomChoice class to randomly draw a few augmentations each time:

import GCL.augmentors as A

aug = A.RandomChoice([A.RWSampling(num_seeds=1000, walk_length=10),
                      A.NodeDropping(pn=0.1),
                      A.FeatureMasking(pf=0.1),
                      A.EdgeRemoving(pe=0.1)],
                     num_choices=1)

Customizing Your Own Augmentation

You can write your own augmentation functions by inheriting the base Augmentor class and defining the augment function.
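As a rough sketch of that pattern, the snippet below defines a feature-zeroing augmentation. Note that the `Augmentor` base class here is a simplified mock of the assumed interface (an `augment` method on a `(x, edge_index, edge_attrs)` tuple), and `RandomFeatureZeroing` is a hypothetical example, not a class shipped with PyGCL; check GCL.augmentors for the exact base-class signature in your installed version:

```python
import random

class Augmentor:
    """Mock of the assumed GCL.augmentors.Augmentor interface."""
    def augment(self, g):
        raise NotImplementedError
    def __call__(self, g):
        return self.augment(g)

class RandomFeatureZeroing(Augmentor):
    """Hypothetical custom augmentor: zero out each node's feature
    vector with probability p."""
    def __init__(self, p, seed=0):
        super().__init__()
        self.p = p
        self.seed = seed
    def augment(self, g):
        x, edge_index, edge_attrs = g
        rng = random.Random(self.seed)
        x = [[0.0] * len(row) if rng.random() < self.p else row for row in x]
        return x, edge_index, edge_attrs
```

Because the class only overrides `augment`, it composes with other augmentations exactly like the built-in ones.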

Contrasting Architectures and Modes

Existing GCL architectures can be grouped into two categories: negative-sample-based methods and negative-sample-free ones.

  • Negative-sample-based approaches can have either one or two branches. In single-branch contrasting, we construct only one graph view and perform contrastive learning within this view. In dual-branch models, we generate two graph views and perform contrastive learning both within and across views.
  • Negative-sample-free approaches eschew the need for explicit negative samples. Currently, PyGCL supports bootstrap-style contrastive learning as well as contrastive learning within embeddings (such as Barlow Twins and VICReg).
| Contrastive architectures | Supported contrastive modes | Need negative samples | Class name | Examples |
| --- | --- | --- | --- | --- |
| Single-branch contrasting | G2L only | Yes | SingleBranchContrast | DGI, InfoGraph |
| Dual-branch contrasting | L2L, G2G, and G2L | Yes | DualBranchContrast | GRACE |
| Bootstrapped contrasting | L2L, G2G, and G2L | No | BootstrapContrast | BGRL |
| Within-embedding contrasting | L2L and G2G | No | WithinEmbedContrast | GBT |

Moreover, you can use add_extra_mask if you want to add positives or remove negatives. This function performs a bitwise OR with the extra positive mask specified by extra_pos_mask and a bitwise AND with the extra negative mask specified by extra_neg_mask. It is helpful, for example, when you have supervision signals from labels and want to train the model in a semi-supervised manner.
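For instance, under label supervision you might treat all same-labeled nodes as extra positives. The helper below is purely illustrative (it is not part of PyGCL) and builds such a mask as nested Python lists; in practice you would build a boolean tensor of the same shape as the contrast masks:

```python
def label_pos_mask(labels):
    """Illustrative helper: mark pairs of distinct samples that share a
    label as extra positives (1 = positive, 0 = not)."""
    n = len(labels)
    return [[int(labels[i] == labels[j] and i != j) for j in range(n)]
            for i in range(n)]

# Nodes 0 and 2 share label 0, so they become mutual extra positives.
mask = label_pos_mask([0, 1, 0])
assert mask == [[0, 0, 1], [0, 0, 0], [1, 0, 0]]
```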

Internally, PyGCL calls Sampler classes in GCL.models that receive embeddings and produce positive/negative masks. PyGCL implements three contrasting modes: (a) Local-Local (L2L), (b) Global-Global (G2G), and (c) Global-Local (G2L). L2L and G2G contrast embeddings at the same scale, while G2L performs cross-scale contrasting. To implement your own GCL model, you may also use these provided samplers:

| Contrastive modes | Class name |
| --- | --- |
| Same-scale contrasting (L2L and G2G) | SameScaleSampler |
| Cross-scale contrasting (G2L) | CrossScaleSampler |
  • For L2L and G2G, embedding pairs of the same node/graph in different views constitute positive pairs. You can refer to GRACE and GraphCL for examples.
  • For G2L, node-graph embedding pairs form positives. Note that for single-graph datasets, the G2L mode requires explicit negative sampling (otherwise no negatives for contrasting). You can refer to DGI for an example.
  • Some models (e.g., GRACE) add extra intra-view negative samples. You may manually call sampler.add_intraview_negs to enlarge the negative sample set.
  • Note that the bootstrapping latent model involves some special model design (asymmetric online/offline encoders and momentum weight updates). You may refer to BGRL for details.
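The L2L positive/negative assignment described above can be sketched in a few lines. This is a conceptual illustration of what a same-scale sampler computes, not PyGCL's SameScaleSampler itself; the function name and plain-list masks are our own:

```python
def l2l_masks(num_nodes):
    """Conceptual L2L sampling: for anchor node i in view 1, node i in
    view 2 is the positive; every other node in view 2 is a negative."""
    pos = [[int(i == j) for j in range(num_nodes)] for i in range(num_nodes)]
    neg = [[1 - pos[i][j] for j in range(num_nodes)] for i in range(num_nodes)]
    return pos, neg

pos, neg = l2l_masks(3)
assert pos == [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # diagonal positives
```

Adding intra-view negatives (as GRACE does) would amount to extending the negative mask with a second block covering the anchor's own view.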

Contrastive Objectives

In GCL.losses, PyGCL implements the following contrastive objectives:

| Contrastive objectives | Class name |
| --- | --- |
| InfoNCE loss | InfoNCE |
| Jensen-Shannon Divergence (JSD) loss | JSD |
| Triplet Margin (TM) loss | Triplet |
| Bootstrapping Latent (BL) loss | BootstrapLatent |
| Barlow Twins (BT) loss | BarlowTwins |
| VICReg loss | VICReg |

All these objectives can contrast arbitrary positive and negative pairs, except for the Barlow Twins and VICReg losses, which perform contrastive learning within embeddings. Moreover, for the InfoNCE and Triplet losses, we further provide SP variants that compute the objective given only one positive pair per sample, to speed up computation and avoid excessive memory consumption.
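As a reference point, here is the InfoNCE objective for a single anchor written out in pure Python. This is a conceptual sketch of the formula, not the library's batched tensor implementation; the function name and scalar-similarity inputs are our own simplification:

```python
import math

def info_nce(sim_pos, sim_negs, tau=0.2):
    """InfoNCE for one anchor with one positive:
    -log( exp(s+/tau) / (exp(s+/tau) + sum_j exp(s-_j/tau)) )."""
    num = math.exp(sim_pos / tau)
    den = num + sum(math.exp(s / tau) for s in sim_negs)
    return -math.log(num / den)

# A well-aligned positive yields a much lower loss than a poorly aligned one.
assert info_nce(0.9, [0.1, 0.2]) < info_nce(0.1, [0.8, 0.9])
```

The temperature tau controls how sharply hard negatives are weighted: lower values concentrate the penalty on the most similar negatives.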

Negative Sampling Strategies

PyGCL further implements several negative sampling strategies:

| Negative sampling strategies | Class name |
| --- | --- |
| Subsampling | GCL.models.SubSampler |
| Hard negative mixing | GCL.models.HardMixing |
| Conditional negative sampling | GCL.models.Ring |
| Debiased contrastive objective | GCL.losses.DebiasedInfoNCE, GCL.losses.DebiasedJSD |
| Hardness-biased negative sampling | GCL.losses.HardnessInfoNCE, GCL.losses.HardnessJSD |

The former three serve as an additional sampling step, similar to the existing Sampler classes, and can be used in conjunction with any objective. The last two are objective variants available only for the InfoNCE and JSD losses.

Utilities

PyGCL provides a variety of evaluator functions to evaluate the embedding quality:

| Evaluator | Class name |
| --- | --- |
| Logistic regression | LREvaluator |
| Support vector machine | SVMEvaluator |
| Random forest | RFEvaluator |

To use these evaluators, you first need to generate dataset splits by get_split (random split) or by from_predefined_split (according to preset splits).
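Conceptually, a random split just shuffles sample indices and partitions them by ratio. The sketch below illustrates that idea in pure Python; it is not PyGCL's get_split (whose exact signature and return type may differ), and the function name and dict layout are our own:

```python
import random

def random_split_sketch(num_samples, train_ratio=0.1, test_ratio=0.8, seed=0):
    """Illustrative random split: shuffle indices, then partition them
    into train/test/valid by the given ratios."""
    idx = list(range(num_samples))
    random.Random(seed).shuffle(idx)
    n_train = int(num_samples * train_ratio)
    n_test = int(num_samples * test_ratio)
    return {'train': idx[:n_train],
            'test': idx[n_train:n_train + n_test],
            'valid': idx[n_train + n_test:]}

split = random_split_sketch(10)
# The three parts are disjoint and together cover every sample exactly once.
assert sorted(split['train'] + split['test'] + split['valid']) == list(range(10))
```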

Contribution

Feel free to open an issue should you find anything unexpected or create pull requests to add your own work! We are motivated to continuously make PyGCL even better.

Citation

Please cite our paper if you use this code in your own work:

@article{Zhu:2021tu,
  author      = {Zhu, Yanqiao and Xu, Yichen and Liu, Qiang and Wu, Shu},
  title       = {{An Empirical Study of Graph Contrastive Learning}},
  journal     = {arXiv.org},
  year        = {2021},
  eprint      = {2109.01116v1},
  eprinttype  = {arxiv},
  eprintclass = {cs.LG},
  month       = sep,
}