A parallel backend for NetworkX. It uses joblib to run NetworkX algorithms on multiple CPU cores.
Project description
nx-parallel
nx-parallel is a NetworkX backend that uses joblib for parallelization. This project aims to provide parallelized implementations of various NetworkX functions to improve performance. Refer to the NetworkX backends documentation to learn more about the backend architecture in NetworkX.
Algorithms in nx-parallel
- all_pairs_all_shortest_paths
- all_pairs_bellman_ford_path
- all_pairs_bellman_ford_path_length
- all_pairs_dijkstra
- all_pairs_dijkstra_path
- all_pairs_dijkstra_path_length
- all_pairs_node_connectivity
- all_pairs_shortest_path
- all_pairs_shortest_path_length
- approximate_all_pairs_node_connectivity
- betweenness_centrality
- closeness_vitality
- edge_betweenness_centrality
- is_reachable
- johnson
- local_efficiency
- node_redundancy
- number_of_isolates
- square_clustering
- tournament_is_strongly_connected
Script used to generate the above list
import _nx_parallel as nxp
d = nxp.get_funcs_info() # temporarily add `from .update_get_info import *` to _nx_parallel/__init__.py
for func in d:
    print(f"- [{func}]({d[func]['url']})")
Installation
It is recommended to first refer to NetworkX's INSTALL.rst. nx-parallel requires Python >=3.11. Right now, the only dependencies of nx-parallel are networkx and joblib.
Installing nx-parallel using pip
You can install the stable version of nx-parallel using pip:
pip install nx-parallel
The above command also installs the two main dependencies of nx-parallel, i.e. networkx and joblib. To upgrade to a newer release, use the --upgrade flag:
pip install --upgrade nx-parallel
Installing the development version
Before installing the development version, you may need to uninstall the standard version of nx-parallel and the other two dependencies using pip:
pip uninstall nx-parallel networkx joblib
Then do:
pip install git+https://github.com/networkx/nx-parallel.git@main
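Regardless of which installation method you used, you can quickly confirm that the package imports and check which versions were installed. Below is a minimal sketch; the printed version numbers will of course differ on your machine:

import importlib.metadata

import nx_parallel  # noqa: F401  -- confirms the package imports without error

# Print the installed versions of nx-parallel and its two dependencies.
for pkg in ("nx-parallel", "networkx", "joblib"):
    print(pkg, importlib.metadata.version(pkg))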
Installing nx-parallel with conda
Installing nx-parallel from the conda-forge channel can be achieved by adding conda-forge to your channels with:
conda config --add channels conda-forge
conda config --set channel_priority strict
Once the conda-forge channel has been enabled, nx-parallel can be installed with conda:
conda install nx-parallel
or with mamba:
mamba install nx-parallel
Backend usage
You can run your networkx code by just setting the NETWORKX_AUTOMATIC_BACKENDS environment variable to parallel:
export NETWORKX_AUTOMATIC_BACKENDS=parallel && python nx_code.py
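Here nx_code.py stands for any script written with plain networkx calls; a minimal sketch (the file name, graph, and parameters are just placeholders) might look like this:

# nx_code.py -- ordinary networkx code, no nx-parallel import needed.
# With NETWORKX_AUTOMATIC_BACKENDS=parallel set, calls that nx-parallel
# implements (e.g. betweenness_centrality) are dispatched to the backend.
import networkx as nx

G = nx.erdos_renyi_graph(1000, 0.01, seed=42)
print(nx.betweenness_centrality(G))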
Note that for all functions inside nx_code.py that do not have an nx-parallel implementation, their original networkx implementations will be executed. You can also use the nx-parallel backend in your code for only some specific function calls in the following ways:
import networkx as nx
import nx_parallel as nxp
# enabling networkx's config for nx-parallel
nx.config.backends.parallel.active = True
# setting `n_jobs` (by default, `n_jobs=None`)
nx.config.backends.parallel.n_jobs = 4
G = nx.path_graph(4)
H = nxp.ParallelGraph(G)
# method 1 : passing ParallelGraph object in networkx function (Type-based dispatching)
nx.betweenness_centrality(H)
# method 2 : using the 'backend' kwarg
nx.betweenness_centrality(G, backend="parallel")
# method 3 : using nx-parallel implementation with networkx object
nxp.betweenness_centrality(G)
# method 4 : using nx-parallel implementation with ParallelGraph object
nxp.betweenness_centrality(H)
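To see whether parallelization actually pays off for your graphs, a rough timing comparison can help. The sketch below is illustrative only: the graph type and size are arbitrary choices, n_jobs=-1 follows joblib's convention of using all available cores, and the speedup you observe will depend on your machine and graph.

import time

import networkx as nx
import nx_parallel  # noqa: F401  -- registers the "parallel" backend

nx.config.backends.parallel.active = True
nx.config.backends.parallel.n_jobs = -1  # joblib convention: use all cores

G = nx.erdos_renyi_graph(500, 0.1, seed=42)

t0 = time.perf_counter()
nx.betweenness_centrality(G)  # default networkx implementation
t1 = time.perf_counter()
nx.betweenness_centrality(G, backend="parallel")  # nx-parallel implementation
t2 = time.perf_counter()

print(f"networkx:    {t1 - t0:.2f}s")
print(f"nx-parallel: {t2 - t1:.2f}s")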
For more on how to play with configurations in nx-parallel, refer to Config.md! Additionally, refer to NetworkX's official backend and config docs for more on the functionalities provided by networkx for backends and configs, like logging, `backend_priority`, etc. Another way to configure nx-parallel is by using `joblib.parallel_config`.
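Below is a minimal sketch of the `joblib.parallel_config` route, assuming that nx-parallel falls back to joblib's settings when the NetworkX-level config is not active (see Config.md for the exact precedence rules); the n_jobs and verbose values here are arbitrary:

import networkx as nx
import nx_parallel  # noqa: F401  -- registers the "parallel" backend
from joblib import parallel_config

# Assumption: with the NetworkX-level config inactive, nx-parallel
# picks up joblib's settings from parallel_config (see Config.md).
nx.config.backends.parallel.active = False

G = nx.path_graph(100)
with parallel_config(n_jobs=4, verbose=10):
    nx.betweenness_centrality(G, backend="parallel")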
Notes
- Some functions in networkx have the same name but different implementations, so to avoid these name conflicts at the time of dispatching, networkx differentiates them by specifying the `name` parameter in the `_dispatchable` decorator of such algorithms. So, `method 3` and `method 4` are not recommended. But you can use them if you know the correct `name`. For example:

  # using `name` parameter - nx-parallel as an independent package
  # runs the parallel implementation in `connectivity/connectivity`
  nxp.all_pairs_node_connectivity(H)
  # runs the parallel implementation in `approximation/connectivity`
  nxp.approximate_all_pairs_node_connectivity(H)

  Also, if you are using nx-parallel as a backend, then mentioning the subpackage to which the algorithm belongs is recommended to ensure that networkx dispatches to the correct implementation. For example:

  # with subpackage - nx-parallel as a backend
  nx.all_pairs_node_connectivity(H)
  nx.approximation.all_pairs_node_connectivity(H)

- Right now there isn't much difference between `nx.Graph` and `nxp.ParallelGraph`, so `method 3` would work fine, but it is not recommended because in the future that might not be the case.
Feel free to contribute to nx-parallel. You can find the contributing guidelines here. If you'd like to implement a feature or fix a bug, we'd be happy to review a pull request. Please make sure to explain the changes you made in the pull request description. And feel free to open issues for any problems you face, or for new features you'd like to see implemented.
This project is managed under the NetworkX organisation, so the code of conduct of NetworkX applies here as well.
All code in this repository is available under the Berkeley Software Distribution (BSD) 3-Clause License (see LICENSE).
Thank you :)
File details
Details for the file nx_parallel-0.3.tar.gz.
File metadata
- Download URL: nx_parallel-0.3.tar.gz
- Upload date:
- Size: 19.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | f3b98323f9e160a15698d6471eee59eb90a0356b2da8a9ddd80818445afc0d04
MD5 | 51a1bc479c3868755bed682587a308f1
BLAKE2b-256 | ac862050e23cfe6b872d3a42eefd52db62a0eef6283231a850a1b3ed3348b1dd
File details
Details for the file nx_parallel-0.3-py3-none-any.whl.
File metadata
- Download URL: nx_parallel-0.3-py3-none-any.whl
- Upload date:
- Size: 28.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 40b5c36504624d7485e5eb487563f6e6d3c18f5ad270a458d1a2ce7c2da4490c
MD5 | 8745dfd02e985279b3a39c9a6c73fd7a
BLAKE2b-256 | e718b615e3c081972e2dedaae9554bfdf959dc9148062da90d522e080ba471b8
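If you want to check a downloaded file against the hashes above, a small sketch like the following works; the file path is a placeholder for wherever pip or your browser saved the archive:

import hashlib

# Placeholder path -- point it at the downloaded archive on your machine.
path = "nx_parallel-0.3-py3-none-any.whl"
expected_sha256 = "40b5c36504624d7485e5eb487563f6e6d3c18f5ad270a458d1a2ce7c2da4490c"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected_sha256 else "hash mismatch")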