
Fast and Customizable Tokenizers



Tokenizers

Provides an implementation of today's most used tokenizers, with a focus on performance and versatility.

These are bindings over the Rust implementation; if you are interested in the high-level design, check out the main tokenizers repository.

Otherwise, let's dive in!

Main features:

  • Train new vocabularies and tokenize using 4 pre-made tokenizers (BERT WordPiece and the 3 most common BPE versions).
  • Extremely fast (both training and tokenization), thanks to the Rust implementation. Takes less than 20 seconds to tokenize a GB of text on a server's CPU.
  • Easy to use, but also extremely versatile.
  • Designed for both research and production.
  • Normalization comes with alignment tracking: it is always possible to get the part of the original sentence that corresponds to a given token.
  • Does all the pre-processing: truncation, padding, and adding the special tokens your model needs.
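To sketch what alignment tracking makes possible: every token comes with (start, end) character offsets into the original text. The sample tokens and offset values below are hypothetical, purely for illustration:

```python
# Each token produced by the tokenizer carries (start, end) character
# offsets into the original text. The values below are made up to
# illustrate the idea; the real ones come from the encoding itself.
text = "I can feel the magic, can you?"
tokens = ["I", "can", "feel"]
offsets = [(0, 1), (2, 5), (6, 10)]

# Recover the exact original substring behind each token:
for token, (start, end) in zip(tokens, offsets):
    assert text[start:end] == token
```

This is what lets downstream tasks (e.g. span extraction) map model outputs back onto the raw input, even after normalization.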

Installation

With pip:

pip install tokenizers

From sources:

To use this method, you need to have Rust installed:

# Install with:
curl https://sh.rustup.rs -sSf | sh -s -- -y
export PATH="$HOME/.cargo/bin:$PATH"

Once Rust is installed, you can compile the bindings by doing the following:

git clone https://github.com/huggingface/tokenizers
cd tokenizers/bindings/python

# Create a virtual env (you can use yours as well)
python -m venv .env
source .env/bin/activate

# Install `tokenizers` in the current virtual env
pip install setuptools_rust
python setup.py install

Using the provided Tokenizers

Using a pre-trained tokenizer is really simple:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
tokenizer = CharBPETokenizer(vocab, merges)

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)
print(encoded.tokens)

And you can train yours just as simply:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
tokenizer = CharBPETokenizer()

# Then train it!
tokenizer.train([ "./path/to/files/1.txt", "./path/to/files/2.txt" ])

# And you can use it
encoded = tokenizer.encode("I can feel the magic, can you?")

# And finally save it somewhere
tokenizer.save("./path/to/directory", "my-bpe")
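The files written by save are plain data: a JSON vocabulary (token to id) and a text file of merges. The exact filenames depend on the version; the toy vocabulary below is hypothetical, just to show the shape of the vocabulary file:

```python
import json
import os
import tempfile

# A toy vocabulary mapping tokens to ids, mimicking the structure of the
# vocab.json written by `tokenizer.save(...)`. Contents are hypothetical.
toy_vocab = {"<unk>": 0, "I": 1, "can</w>": 2, "feel</w>": 3}

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "my-bpe-vocab.json")
    with open(path, "w") as f:
        json.dump(toy_vocab, f)
    # The file can be re-read as a regular token -> id mapping:
    with open(path) as f:
        reloaded = json.load(f)
    assert reloaded == toy_vocab
```

Because the format is this simple, saved vocabularies can be inspected, diffed, or version-controlled like any other text file.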

Provided Tokenizers

  • CharBPETokenizer: The original BPE
  • ByteLevelBPETokenizer: The byte level version of the BPE
  • SentencePieceBPETokenizer: A BPE implementation compatible with the one used by SentencePiece
  • BertWordPieceTokenizer: The famous Bert tokenizer, using WordPiece

All of these can be used and trained as explained above!
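To give an idea of how BertWordPieceTokenizer differs from the BPE variants: WordPiece tokenizes each word by greedy longest-match-first lookup against the vocabulary, marking word-internal pieces with a `##` prefix. Here is a minimal, self-contained sketch of that matching rule (a toy vocabulary, not the library's actual Rust implementation):

```python
def wordpiece(word, vocab, unk="[UNK]"):
    """Greedy longest-match-first WordPiece, BERT-style."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        while start < end:
            candidate = word[start:end]
            if start > 0:
                candidate = "##" + candidate  # word-internal pieces are marked
            if candidate in vocab:
                piece = candidate
                break
            end -= 1  # no match: try a shorter prefix
        if piece is None:
            return [unk]  # nothing matched: the whole word becomes [UNK]
        pieces.append(piece)
        start = end
    return pieces

vocab = {"un", "##aff", "##able", "play", "##ing"}
assert wordpiece("unaffable", vocab) == ["un", "##aff", "##able"]
assert wordpiece("playing", vocab) == ["play", "##ing"]
```

BPE, by contrast, builds words bottom-up by applying learned merges, which is why the two families need different trainers and vocabularies.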

Build your own

You can also easily build your own tokenizers, by putting all the different parts you need together:

Use a pre-trained tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, processors

# Load a BPE Model
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
bpe = models.BPE.from_files(vocab, merges)

# Initialize a tokenizer
tokenizer = Tokenizer(bpe)

# Customize pre-tokenization and decoding
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=True)
tokenizer.decoder = decoders.ByteLevel()
tokenizer.post_processor = processors.ByteLevel(trim_offsets=True)

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)
print(encoded.tokens)

# Or tokenize multiple sentences at once:
encoded = tokenizer.encode_batch([
	"I can feel the magic, can you?",
	"The quick brown fox jumps over the lazy dog"
])
print(encoded)
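A note on the ByteLevel components used above: they operate on raw UTF-8 bytes, remapped to printable characters so that any input is representable without an unknown token. This is why byte-level tokens for space-preceded words start with `Ġ` (the remapped space byte). A sketch of the GPT-2-style byte-to-unicode table (the library implements this in Rust):

```python
def bytes_to_unicode():
    """Map every byte (0-255) to a printable unicode character.
    Printable ASCII/Latin-1 bytes map to themselves; the rest are
    shifted past U+0100 so the result is always displayable."""
    printable = (
        list(range(ord("!"), ord("~") + 1))
        + list(range(ord("¡"), ord("¬") + 1))
        + list(range(ord("®"), ord("ÿ") + 1))
    )
    mapping = {b: chr(b) for b in printable}
    shift = 0
    for b in range(256):
        if b not in mapping:
            mapping[b] = chr(256 + shift)
            shift += 1
    return mapping

table = bytes_to_unicode()
# The space byte (32) becomes 'Ġ', which is why byte-level BPE tokens
# for words preceded by a space begin with that character:
assert table[32] == "Ġ"
assert "".join(table[b] for b in " magic".encode("utf-8")) == "Ġmagic"
```

The `add_prefix_space=True` option above ensures the first word of a sentence gets the same `Ġ`-prefixed form as every other word, so it shares vocabulary entries with mid-sentence occurrences.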

Train a new tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, trainers, processors

# Initialize a tokenizer
tokenizer = Tokenizer(models.BPE.empty())

# Customize pre-tokenization and decoding
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=True)
tokenizer.decoder = decoders.ByteLevel()
tokenizer.post_processor = processors.ByteLevel(trim_offsets=True)

# And then train
trainer = trainers.BpeTrainer(vocab_size=20000, min_frequency=2)
tokenizer.train(trainer, [
	"./path/to/dataset/1.txt",
	"./path/to/dataset/2.txt",
	"./path/to/dataset/3.txt"
])

# Now we can encode
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded)
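Conceptually, what the BpeTrainer learns is an ordered list of merges: it repeatedly finds the most frequent adjacent symbol pair in the corpus and fuses it into a new symbol, until the vocabulary budget is reached. A self-contained toy version of one merge step (illustrative only, not the library's actual implementation):

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across a corpus of symbol sequences,
    weighted by word frequency, and return the most frequent one."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return max(pairs, key=pairs.get)

def merge_pair(words, pair):
    """Replace every occurrence of `pair` with a single fused symbol."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Toy corpus: words split into characters, with counts.
words = {("h", "u", "g"): 10, ("p", "u", "g"): 5, ("h", "u", "g", "s"): 5}
pair = most_frequent_pair(words)
assert pair == ("u", "g")  # "ug" occurs 20 times, more than any other pair
words = merge_pair(words, pair)
assert ("h", "ug") in words
```

The `min_frequency=2` argument above corresponds to refusing merges whose pair count falls below that threshold; `vocab_size` caps how many merge steps are taken.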
