Python library for machine learning on graphs
StellarGraph Machine Learning Library
StellarGraph is a Python library for machine learning on graphs and networks.
Introduction
The StellarGraph library offers state-of-the-art algorithms for graph machine learning, making it easy to discover patterns and answer questions about graph-structured data. It can solve many machine learning tasks:
- Representation learning for nodes and edges, to be used for visualisation and various downstream machine learning tasks;
- Classification and attribute inference of nodes or edges;
- Classification of whole graphs;
- Link prediction;
- Interpretation of node classification [8].
Graph-structured data represent entities as nodes (or vertices) and relationships between them as edges (or links), and can include data associated with either as attributes. For example, a graph can contain people as nodes and friendships between them as links, with data like a person's age and the date a friendship was established. StellarGraph supports analysis of many kinds of graphs:
- homogeneous (with nodes and links of one type);
- heterogeneous (with more than one type of nodes and/or links);
- knowledge graphs (extreme heterogeneous graphs with thousands of types of edges);
- graphs with or without data associated with nodes;
- graphs with edge weights.
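For example, the people-and-friendships graph described above can be assembled from two Pandas DataFrames. The snippet below is a minimal illustrative sketch with made-up data (the node IDs, feature and column names are only examples): the StellarGraph constructor accepts node features indexed by node ID and an edge list with source, target and optional weight columns.
import pandas as pd
from stellargraph import StellarGraph
# node features, indexed by node ID (here: each person's age)
people = pd.DataFrame({"age": [25, 31, 42]}, index=["alice", "bob", "carol"])
# edges as source/target pairs, with an optional numeric edge weight
friendships = pd.DataFrame(
    {"source": ["alice", "bob"], "target": ["bob", "carol"], "weight": [3.0, 1.0]}
)
graph = StellarGraph(people, friendships)
print(graph.info())  # summary of node and edge counts, types and feature sizes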
StellarGraph is built on TensorFlow 2 and its Keras high-level API, as well as Pandas and NumPy. It is thus user-friendly, modular and extensible. It interoperates smoothly with code that builds on these libraries, such as the standard Keras layers and scikit-learn, so it is easy to augment the core graph machine learning algorithms provided by StellarGraph. It is also easy to install with pip or Anaconda.
Getting Started
The numerous detailed and narrated examples are a good way to get started with StellarGraph. There is likely to be one that is similar to your data or your problem (if not, let us know).
You can start working with the examples immediately in Google Colab or Binder by clicking the Colab and Binder badges within each Jupyter notebook.
Alternatively, you can download a local copy of the demos and run them using Jupyter. The demos can be downloaded by cloning the master branch of this repository, or by using the curl command below:
curl -L https://github.com/stellargraph/stellargraph/archive/master.tar.gz | tar -xz --strip=1 stellargraph-master/demos
The dependencies required to run most of our demo notebooks locally can be installed using one of the following:
- Using pip: pip install stellargraph[demos]
- Using conda: conda install -c stellargraph stellargraph
(See the Installation section for more details and more options.)
Getting Help
If you get stuck or have a problem, there are many ways to make progress and get help or support:
- Read the documentation
- Consult the examples
- Contact us
Example: GCN
One of the earliest deep machine learning algorithms for graphs is the Graph Convolutional Network (GCN) [6]. The following example uses it for node classification: predicting the class to which a node belongs. It shows how easy GCN is to apply using StellarGraph, and how smoothly StellarGraph integrates with Pandas, TensorFlow and libraries built on them.
Data preparation
Data for StellarGraph can be prepared using common libraries like Pandas and scikit-learn.
import pandas as pd
from sklearn import model_selection

def load_my_data():
    # your own code to load data into Pandas DataFrames, e.g. from CSV files or a database
    ...

nodes, edges, targets = load_my_data()

# use scikit-learn to compute training and test sets
train_targets, test_targets = model_selection.train_test_split(targets, train_size=0.5)
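As a concrete illustration, load_my_data might look like the following. This is only a hypothetical sketch: the CSV file names and column names ("nodes.csv", "edges.csv", "labels.csv", "id", "source", "target", "label") are assumptions for the example, not something StellarGraph requires.
def load_my_data():
    # node features, indexed by a node-ID column
    nodes = pd.read_csv("nodes.csv", index_col="id")
    # edge list with "source" and "target" columns referring to node IDs
    edges = pd.read_csv("edges.csv")
    # one-hot node labels indexed by node ID, suitable for a softmax classifier
    labels = pd.read_csv("labels.csv", index_col="id")["label"]
    targets = pd.get_dummies(labels)
    return nodes, edges, targets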
Graph machine learning model
This is the only part that is specific to StellarGraph. The machine learning model consists of some graph convolution layers followed by a layer to compute the actual predictions as a TensorFlow tensor. StellarGraph makes it easy to construct all of these layers via the GCN model class. It also makes it easy to get input data in the right format via the StellarGraph graph data type and a data generator.
import stellargraph as sg
import tensorflow as tf

# convert the raw data into StellarGraph's graph format for faster operations
graph = sg.StellarGraph(nodes, edges)

generator = sg.mapper.FullBatchNodeGenerator(graph, method="gcn")

# two layers of GCN, each with hidden dimension 16
gcn = sg.layer.GCN(layer_sizes=[16, 16], generator=generator)
x_inp, x_out = gcn.in_out_tensors()  # create the input and output TensorFlow tensors

# use TensorFlow Keras to add a layer to compute the (one-hot) predictions
predictions = tf.keras.layers.Dense(units=len(targets.columns), activation="softmax")(x_out)

# use the input and output tensors to create a TensorFlow Keras model
model = tf.keras.Model(inputs=x_inp, outputs=predictions)
Training and evaluation
The model is a conventional TensorFlow Keras model, and so tasks such as training and evaluation can use the functions offered by Keras. StellarGraph's data generators make it simple to construct the required Keras Sequences for input data.
# prepare the model for training with the Adam optimiser and an appropriate loss function
model.compile("adam", loss="categorical_crossentropy", metrics=["accuracy"])
# train the model on the train set
model.fit(generator.flow(train_targets.index, train_targets), epochs=5)
# check model generalisation on the test set
(loss, accuracy) = model.evaluate(generator.flow(test_targets.index, test_targets))
print(f"Test set: loss = {loss}, accuracy = {accuracy}")
This algorithm is spelled out in more detail in its extended narrated notebook. We provide many more algorithms, each with a detailed example.
Algorithms
The StellarGraph library currently includes the following algorithms for graph machine learning:
Algorithm | Description |
---|---|
GraphSAGE [1] | Supports supervised as well as unsupervised representation learning, node classification/regression, and link prediction for homogeneous networks. The current implementation supports multiple aggregation methods, including mean, maxpool, meanpool, and attentional aggregators. |
HinSAGE | Extension of GraphSAGE algorithm to heterogeneous networks. Supports representation learning, node classification/regression, and link prediction/regression for heterogeneous graphs. The current implementation supports mean aggregation of neighbour nodes, taking into account their types and the types of links between them. |
attri2vec [4] | Supports node representation learning, node classification, and out-of-sample node link prediction for homogeneous graphs with node attributes. |
Graph ATtention Network (GAT) [5] | The GAT algorithm supports representation learning and node classification for homogeneous graphs. There are versions of the graph attention layer that support both sparse and dense adjacency matrices. |
Graph Convolutional Network (GCN) [6] | The GCN algorithm supports representation learning and node classification for homogeneous graphs. There are versions of the graph convolutional layer that support both sparse and dense adjacency matrices. |
Cluster Graph Convolutional Network (Cluster-GCN) [10] | An extension of the GCN algorithm supporting representation learning and node classification for homogeneous graphs. Cluster-GCN scales to larger graphs and can be used to train deeper GCN models using Stochastic Gradient Descent. |
Simplified Graph Convolutional network (SGC) [7] | The SGC network algorithm supports representation learning and node classification for homogeneous graphs. It is an extension of the GCN algorithm that smooths the graph to bring in more distant neighbours of nodes without using multiple layers. |
(Approximate) Personalized Propagation of Neural Predictions (PPNP/APPNP) [9] | The (A)PPNP algorithm supports fast and scalable representation learning and node classification for attributed homogeneous graphs. In a semi-supervised setting, first a multilayer neural network is trained using the node attributes as input. The predictions from the latter network are then diffused across the graph using a method based on Personalized PageRank. |
Node2Vec [2] | The Node2Vec and DeepWalk algorithms perform unsupervised representation learning for homogeneous networks, taking into account network structure while ignoring node attributes. The node2vec algorithm is implemented by combining StellarGraph's random walk generator with the word2vec algorithm from Gensim (a minimal sketch of this pipeline follows the table). Learned node representations can be used in downstream machine learning models implemented using Scikit-learn, Keras, TensorFlow or any other Python machine learning library. |
Metapath2Vec [3] | The metapath2vec algorithm performs unsupervised, metapath-guided representation learning for heterogeneous networks, taking into account network structure while ignoring node attributes. The implementation combines StellarGraph's metapath-guided random walk generator and Gensim word2vec algorithm. As with node2vec, the learned node representations (node embeddings) can be used in downstream machine learning models to solve tasks such as node classification, link prediction, etc, for heterogeneous networks. |
Relational Graph Convolutional Network [11] | The RGCN algorithm performs semi-supervised learning for node representation and node classification on knowledge graphs. RGCN extends GCN to directed graphs with multiple edge types and works with both sparse and dense adjacency matrices. |
ComplEx [12] | The ComplEx algorithm computes embeddings for nodes (entities) and edge types (relations) in knowledge graphs, and can use these for link prediction. |
GraphWave [13] | GraphWave calculates unsupervised structural embeddings via wavelet diffusion through the graph. |
Supervised Graph Classification | A model for supervised graph classification based on GCN [6] layers and mean pooling readout. |
Watch Your Step [14] | The Watch Your Step algorithm computes node embeddings by using adjacency powers to simulate expected random walks. |
Deep Graph Infomax [15] | Deep Graph Infomax trains unsupervised GNNs to maximize the shared information between node level and graph level features. |
Continuous-Time Dynamic Network Embeddings (CTDNE) [16] | Supports time-respecting random walks which can be used in a similar way as in Node2Vec for unsupervised representation learning. |
DistMult [17] | The DistMult algorithm computes embeddings for nodes (entities) and edge types (relations) in knowledge graphs, and can use these for link prediction. |
DGCNN [18] | The Deep Graph Convolutional Neural Network (DGCNN) algorithm for supervised graph classification. |
TGCN [19] | The GCN_LSTM model in StellarGraph follows the Temporal Graph Convolutional Network architecture proposed in the TGCN paper, with a few enhancements to the layer architecture. |
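As a taste of a second algorithm, the following is a minimal sketch of the Node2Vec pipeline described in the table above: StellarGraph's BiasedRandomWalk generates second-order random walks over an existing StellarGraph graph, and Gensim's Word2Vec learns embeddings from those walks. The walk and Word2Vec hyperparameters here are illustrative only, and the vector_size keyword assumes Gensim 4 (older Gensim versions call it size).
from gensim.models import Word2Vec
from stellargraph.data import BiasedRandomWalk

# run a few biased (p, q) random walks starting from every node
walks = BiasedRandomWalk(graph).run(
    nodes=list(graph.nodes()), length=10, n=5, p=0.5, q=2.0
)

# Word2Vec expects sequences of tokens, so convert node IDs to strings
str_walks = [[str(node) for node in walk] for walk in walks]
w2v = Word2Vec(str_walks, vector_size=128, window=5, min_count=0, sg=1)

# the learned embedding for a particular node
embedding = w2v.wv[str(list(graph.nodes())[0])]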
Installation
StellarGraph is a Python 3 library and we recommend using Python version 3.6. The required Python version can be downloaded and installed from python.org. Alternatively, use the Anaconda Python environment, available from anaconda.com.
The StellarGraph library can be installed from PyPI, from Anaconda Cloud, or directly from GitHub, as described below.
Install StellarGraph using PyPI:
To install the StellarGraph library from PyPI using pip, execute the following command:
pip install stellargraph
Some of the examples require installing additional dependencies as well as stellargraph. To install these dependencies as well as StellarGraph using pip, execute the following command:
pip install stellargraph[demos]
The community detection demos require python-igraph, which is only available on some platforms. To install this in addition to the other demo requirements:
pip install stellargraph[demos,igraph]
Install StellarGraph in Anaconda Python:
The StellarGraph library is available on Anaconda Cloud and can be installed into an Anaconda Python environment using the command-line conda tool by executing the following command:
conda install -c stellargraph stellargraph
Install StellarGraph from GitHub source:
First, clone the StellarGraph repository using git:
git clone https://github.com/stellargraph/stellargraph.git
Then, cd to the StellarGraph folder and install the library by executing the following commands:
cd stellargraph
pip install .
Some of the examples in the demos directory require installing additional dependencies as well as stellargraph. To install these dependencies as well as StellarGraph using pip, execute the following command:
pip install .[demos]
Citing
StellarGraph is designed, developed and supported by CSIRO's Data61. If you use any part of this library in your research, please cite it using the following BibTeX entry:
@misc{StellarGraph,
  author = {CSIRO's Data61},
  title = {StellarGraph Machine Learning Library},
  year = {2018},
  publisher = {GitHub},
  journal = {GitHub Repository},
  howpublished = {\url{https://github.com/stellargraph/stellargraph}},
}
References
1. Inductive Representation Learning on Large Graphs. W. L. Hamilton, R. Ying, and J. Leskovec. Neural Information Processing Systems (NIPS), 2017.
2. Node2Vec: Scalable Feature Learning for Networks. A. Grover and J. Leskovec. ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2016.
3. Metapath2Vec: Scalable Representation Learning for Heterogeneous Networks. Yuxiao Dong, Nitesh V. Chawla, and Ananthram Swami. ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 135–144, 2017.
4. Attributed Network Embedding via Subspace Discovery. D. Zhang, J. Yin, X. Zhu, and C. Zhang. Data Mining and Knowledge Discovery, 2019.
5. Graph Attention Networks. P. Veličković et al. International Conference on Learning Representations (ICLR), 2018.
6. Graph Convolutional Networks (GCN): Semi-Supervised Classification with Graph Convolutional Networks. Thomas N. Kipf and Max Welling. International Conference on Learning Representations (ICLR), 2017.
7. Simplifying Graph Convolutional Networks. F. Wu, T. Zhang, A. H. de Souza, C. Fifty, T. Yu, and K. Q. Weinberger. International Conference on Machine Learning (ICML), 2019.
8. Adversarial Examples on Graph Data: Deep Insights into Attack and Defense. H. Wu, C. Wang, Y. Tyshetskiy, A. Docherty, K. Lu, and L. Zhu. IJCAI, 2019.
9. Predict then Propagate: Graph Neural Networks Meet Personalized PageRank. J. Klicpera, A. Bojchevski, and S. Günnemann. ICLR, 2019. arXiv:1810.05997
10. Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks. W. Chiang, X. Liu, S. Si, Y. Li, S. Bengio, and C. Hsieh. KDD, 2019. arXiv:1905.07953
11. Modeling Relational Data with Graph Convolutional Networks. M. Schlichtkrull, T. N. Kipf, P. Bloem, R. van den Berg, I. Titov, and M. Welling. European Semantic Web Conference (ESWC), 2018. arXiv:1703.06103
12. Complex Embeddings for Simple Link Prediction. T. Trouillon, J. Welbl, S. Riedel, É. Gaussier, and G. Bouchard. ICML, 2016.
13. Learning Structural Node Embeddings via Diffusion Wavelets. C. Donnat, M. Zitnik, D. Hallac, and J. Leskovec. SIGKDD, 2018. arXiv:1710.10321
14. Watch Your Step: Learning Node Embeddings via Graph Attention. S. Abu-El-Haija, B. Perozzi, R. Al-Rfou, and A. Alemi. NIPS, 2018. arXiv:1710.09599
15. Deep Graph Infomax. P. Veličković, W. Fedus, W. L. Hamilton, P. Lio, Y. Bengio, and R. D. Hjelm. ICLR, 2019. arXiv:1809.10341
16. Continuous-Time Dynamic Network Embeddings. Giang Hoang Nguyen, John Boaz Lee, Ryan A. Rossi, Nesreen K. Ahmed, Eunyee Koh, and Sungchul Kim. Proceedings of the 3rd International Workshop on Learning Representations for Big Networks (WWW BigNet), 2018.
17. Embedding Entities and Relations for Learning and Inference in Knowledge Bases. Bishan Yang, Wen-tau Yih, Xiaodong He, Jianfeng Gao, and Li Deng. ICLR, 2015. arXiv:1412.6575
18. An End-to-End Deep Learning Architecture for Graph Classification. Muhan Zhang, Zhicheng Cui, Marion Neumann, and Yixin Chen. AAAI, 2018.
19. T-GCN: A Temporal Graph Convolutional Network for Traffic Prediction. Ling Zhao, Yujiao Song, Chao Zhang, Yu Liu, Pu Wang, Tao Lin, Min Deng, and Haifeng Li. IEEE Transactions on Intelligent Transportation Systems, 2019.