
Fast and Customizable Tokenizers

Project description





Tokenizers

Provides an implementation of today's most used tokenizers, with a focus on performance and versatility.

These are bindings over the Rust implementation. If you are interested in the high-level design, you can check it out in the main tokenizers repository.

Otherwise, let's dive in!

Main features:

  • Train new vocabularies and tokenize using 4 pre-made tokenizers (Bert WordPiece and the 3 most common BPE versions).
  • Extremely fast (both training and tokenization), thanks to the Rust implementation. Takes less than 20 seconds to tokenize a GB of text on a server's CPU.
  • Easy to use, but also extremely versatile.
  • Designed for research and production.
  • Normalization comes with alignment tracking: it is always possible to recover the part of the original sentence that corresponds to a given token.
  • Does all the pre-processing: truncate, pad, and add the special tokens your model needs.
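The alignment tracking mentioned above can be pictured with a small pure-Python sketch (an illustration of the concept, not the library's internals): each token keeps the character offsets of the span it came from, so any token can be mapped back to the original text.

```python
# Toy illustration of alignment tracking: every token records the
# (start, end) character offsets pointing back into the original text.
text = "I can feel the magic"

# A trivial whitespace "tokenizer" that tracks offsets as it goes.
tokens, offsets, pos = [], [], 0
for word in text.split(" "):
    start = text.index(word, pos)
    end = start + len(word)
    tokens.append(word)
    offsets.append((start, end))
    pos = end

# Any token can be mapped back to the exact original span.
for tok, (start, end) in zip(tokens, offsets):
    assert text[start:end] == tok
```

The real library maintains such offsets through normalization and pre-tokenization as well, which is what makes the round trip possible even after the text has been transformed.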

Installation

With pip:

pip install tokenizers

From sources:

To use this method, you need to have Rust installed:

# Install with:
curl https://sh.rustup.rs -sSf | sh -s -- -y
export PATH="$HOME/.cargo/bin:$PATH"

Once Rust is installed, you can compile the bindings by doing the following:

git clone https://github.com/huggingface/tokenizers
cd tokenizers/bindings/python

# Create a virtual env (you can use yours as well)
python -m venv .env
source .env/bin/activate

# Install `tokenizers` in the current virtual env
pip install setuptools_rust
python setup.py install

Using the provided Tokenizers

Using a pre-trained tokenizer is really simple:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
tokenizer = CharBPETokenizer(vocab, merges)

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)
print(encoded.tokens)

And you can train yours just as simply:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
tokenizer = CharBPETokenizer()

# Then train it!
tokenizer.train([ "./path/to/files/1.txt", "./path/to/files/2.txt" ])

# And you can use it
encoded = tokenizer.encode("I can feel the magic, can you?")

# And finally save it somewhere
tokenizer.save("./path/to/directory", "my-bpe")

Provided Tokenizers

  • CharBPETokenizer: The original BPE
  • ByteLevelBPETokenizer: The byte level version of the BPE
  • SentencePieceBPETokenizer: A BPE implementation compatible with the one used by SentencePiece
  • BertWordPieceTokenizer: The famous Bert tokenizer, using WordPiece

All of these can be used and trained as explained above!
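To make the BPE family above concrete, here is a minimal pure-Python sketch of the core BPE training loop: repeatedly count the most frequent adjacent pair of symbols and merge it. This is only an illustration of the algorithm; the library's Rust implementation is considerably more sophisticated.

```python
from collections import Counter

def bpe_train(words, num_merges):
    """Learn BPE merges from a list of words (toy version)."""
    # Start from characters; a "word" is a tuple of symbols.
    corpus = Counter(tuple(w) for w in words)
    merges = []
    for _ in range(num_merges):
        # Count every adjacent symbol pair, weighted by word frequency.
        pairs = Counter()
        for word, freq in corpus.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Replace every occurrence of the best pair with the merged symbol.
        new_corpus = Counter()
        for word, freq in corpus.items():
            out, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                    out.append(word[i] + word[i + 1])
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_corpus[tuple(out)] += freq
        corpus = new_corpus
    return merges

merges = bpe_train(["low", "low", "lower", "lowest"], num_merges=2)
print(merges)
```

On this tiny corpus the most frequent pairs are ("l", "o") and then the merged ("lo", "w"), so the learned vocabulary quickly contains "low" as a single symbol.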

Build your own

You can also easily build your own tokenizers, by putting all the different parts you need together:

Use a pre-trained tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, processors

# Load a BPE Model
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
bpe = models.BPE.from_files(vocab, merges)

# Initialize a tokenizer
tokenizer = Tokenizer(bpe)

# Customize pre-tokenization and decoding
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=True)
tokenizer.decoder = decoders.ByteLevel()
tokenizer.post_processor = processors.ByteLevel(trim_offsets=True)

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)
print(encoded.tokens)

# Or tokenize multiple sentences at once:
encoded = tokenizer.encode_batch([
	"I can feel the magic, can you?",
	"The quick brown fox jumps over the lazy dog"
])
print(encoded)
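The ByteLevel components above work on raw bytes rather than characters: each of the 256 possible byte values is mapped to a printable unicode character, so no input can ever be out of vocabulary. Here is a sketch of such a mapping, following the scheme popularized by GPT-2 (illustrative; not necessarily the library's exact table):

```python
def bytes_to_unicode():
    """Map every byte value (0-255) to a printable unicode character."""
    # Printable byte values keep their own character...
    bs = (list(range(ord("!"), ord("~") + 1))
          + list(range(ord("¡"), ord("¬") + 1))
          + list(range(ord("®"), ord("ÿ") + 1)))
    cs = bs[:]
    # ...and the remaining bytes are shifted into unused code points.
    n = 0
    for b in range(256):
        if b not in bs:
            bs.append(b)
            cs.append(256 + n)
            n += 1
    return dict(zip(bs, [chr(c) for c in cs]))

byte_map = bytes_to_unicode()
encoded_word = "".join(byte_map[b] for b in " hello".encode("utf-8"))
print(encoded_word)  # "Ġhello" — the leading space becomes 'Ġ'
```

This is why byte-level BPE vocabularies are full of tokens like "Ġhello": the 'Ġ' marks a leading space, and the decoder simply inverts the mapping to recover the original bytes.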

Train a new tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, trainers, processors

# Initialize a tokenizer
tokenizer = Tokenizer(models.BPE.empty())

# Customize pre-tokenization and decoding
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=True)
tokenizer.decoder = decoders.ByteLevel()
tokenizer.post_processor = processors.ByteLevel(trim_offsets=True)

# And then train
trainer = trainers.BpeTrainer(vocab_size=20000, min_frequency=2)
tokenizer.train(trainer, [
	"./path/to/dataset/1.txt",
	"./path/to/dataset/2.txt",
	"./path/to/dataset/3.txt"
])

# Now we can encode
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded)
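Once trained, encoding a word with a BPE model essentially means splitting it into characters and replaying the learned merges in order. A toy pure-Python sketch, using a hypothetical merges list (illustrative only, not the library's implementation):

```python
def bpe_encode(word, merges):
    """Apply learned BPE merges, in order, to a single word (toy version)."""
    symbols = list(word)
    for a, b in merges:
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        symbols = out
    return symbols

# Merges as a trainer might have learned them (hypothetical).
merges = [("l", "o"), ("lo", "w"), ("e", "r")]
print(bpe_encode("lowering", merges))  # ['low', 'er', 'i', 'n', 'g']
```

Frequent substrings covered by the merges come out as single tokens, while unseen material falls back to smaller pieces, which is exactly the behavior that keeps BPE vocabularies compact yet open-ended.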

Project details



Download files

Download the file for your platform.

Source Distribution

  • tokenizers-0.7.0rc3.tar.gz (79.0 kB, Source)

Built Distributions

  • tokenizers-0.7.0rc3-cp38-cp38-win_amd64.whl (1.1 MB, CPython 3.8, Windows x86-64)
  • tokenizers-0.7.0rc3-cp38-cp38-win32.whl (945.3 kB, CPython 3.8, Windows x86)
  • tokenizers-0.7.0rc3-cp38-cp38-manylinux1_x86_64.whl (7.4 MB, CPython 3.8)
  • tokenizers-0.7.0rc3-cp38-cp38-macosx_10_10_x86_64.whl (1.2 MB, CPython 3.8, macOS 10.10+ x86-64)
  • tokenizers-0.7.0rc3-cp37-cp37m-win_amd64.whl (1.1 MB, CPython 3.7m, Windows x86-64)
  • tokenizers-0.7.0rc3-cp37-cp37m-win32.whl (945.7 kB, CPython 3.7m, Windows x86)
  • tokenizers-0.7.0rc3-cp37-cp37m-manylinux1_x86_64.whl (5.6 MB, CPython 3.7m)
  • tokenizers-0.7.0rc3-cp37-cp37m-macosx_10_10_x86_64.whl (1.2 MB, CPython 3.7m, macOS 10.10+ x86-64)
  • tokenizers-0.7.0rc3-cp36-cp36m-win_amd64.whl (1.1 MB, CPython 3.6m, Windows x86-64)
  • tokenizers-0.7.0rc3-cp36-cp36m-win32.whl (946.0 kB, CPython 3.6m, Windows x86)
  • tokenizers-0.7.0rc3-cp36-cp36m-manylinux1_x86_64.whl (3.7 MB, CPython 3.6m)
  • tokenizers-0.7.0rc3-cp36-cp36m-macosx_10_10_x86_64.whl (1.2 MB, CPython 3.6m, macOS 10.10+ x86-64)
  • tokenizers-0.7.0rc3-cp35-cp35m-win_amd64.whl (1.1 MB, CPython 3.5m, Windows x86-64)
  • tokenizers-0.7.0rc3-cp35-cp35m-win32.whl (946.0 kB, CPython 3.5m, Windows x86)
  • tokenizers-0.7.0rc3-cp35-cp35m-manylinux1_x86_64.whl (1.9 MB, CPython 3.5m)
  • tokenizers-0.7.0rc3-cp35-cp35m-macosx_10_10_x86_64.whl (1.2 MB, CPython 3.5m, macOS 10.10+ x86-64)

File details

Every file in this release was uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.44.1 CPython/3.8.2. Uploaded using Trusted Publishing? No.

SHA256 hashes:

  • tokenizers-0.7.0rc3.tar.gz: f2ecd4bdf530bd2a95ff19d301ee530429793a90a02a11612d0a400b5c4d9f31
  • tokenizers-0.7.0rc3-cp38-cp38-win_amd64.whl: 4b7c2c3779dea2cf250992171cf043a9d74addc5e732cd30c5b142442d8fbe82
  • tokenizers-0.7.0rc3-cp38-cp38-win32.whl: bf51ed4cb9eecf7490d70c7396ccd717736a34a2a5780e7d842e1b36c8e3391a
  • tokenizers-0.7.0rc3-cp38-cp38-manylinux1_x86_64.whl: 74bc0799746d443f007523f1b6e789cbffea24b80d149ca0717a01d289414ca7
  • tokenizers-0.7.0rc3-cp38-cp38-macosx_10_10_x86_64.whl: 87734f87281d9979a52196147ba7a54cf8c1f3da93b64db9e523bc2eb623c776
  • tokenizers-0.7.0rc3-cp37-cp37m-win_amd64.whl: 4676fbe5672f59a57ed87e76df7e4faed43e80b10610c86c1418d362b300a92c
  • tokenizers-0.7.0rc3-cp37-cp37m-win32.whl: c2566a65fd4971d7e3696b4b79ea7b0a1292d5348ca288dd1f27a3b005a86a0f
  • tokenizers-0.7.0rc3-cp37-cp37m-manylinux1_x86_64.whl: 0bb3a57f0a868dd532a917eb7c7e6d48fc041404797e01b5fde2f21dbaec2688
  • tokenizers-0.7.0rc3-cp37-cp37m-macosx_10_10_x86_64.whl: 06d455c9b319d40ccded719ec2e1ba675aa43588f06053fbb3c4ea0e16c992e5
  • tokenizers-0.7.0rc3-cp36-cp36m-win_amd64.whl: 42920dac12b478bdf697d83db10d03c5e80e8b10156cb32e7a210702502c267c
  • tokenizers-0.7.0rc3-cp36-cp36m-win32.whl: cf6bc40e3d69e42bc86c3c1ce572cf2c7d42c7590f2ada017c43a455e5c3e0b7
  • tokenizers-0.7.0rc3-cp36-cp36m-manylinux1_x86_64.whl: 8ce115c7cb0ac41cf84f715616d37d27c72cbedd1e577b0db38f10ba3a30b2f6
  • tokenizers-0.7.0rc3-cp36-cp36m-macosx_10_10_x86_64.whl: c36a2162a028e49e6f3a85b605ddd339efd05b4e7b8ebdcdc71068aff7dcf2b7
  • tokenizers-0.7.0rc3-cp35-cp35m-win_amd64.whl: dbf643acdafc933fd3ef78a7fe3b954d5ec2af75228377c9b2e3535e2bd30f33
  • tokenizers-0.7.0rc3-cp35-cp35m-win32.whl: 202b545dae2ec27a00698d25a030177bc28b5a9d6fda88b32f8d2574cd5b449e
  • tokenizers-0.7.0rc3-cp35-cp35m-manylinux1_x86_64.whl: 6db4e43385cd57865e10c33a1319e433749e746cc3cd2deb695b466844be0de5
  • tokenizers-0.7.0rc3-cp35-cp35m-macosx_10_10_x86_64.whl: 1414ef5ef630dad0cb42e71be90872a7846f3ca5ba248e7b07e9b60e1c0fedc3
