Fast and Customizable Tokenizers

Tokenizers

Provides an implementation of today's most used tokenizers, with a focus on performance and versatility.

These are bindings over the Rust implementation. If you are interested in the high-level design, you can check it out there.

Otherwise, let's dive in!

Main features:

  • Train new vocabularies and tokenize using 4 pre-made tokenizers (BERT WordPiece and the 3 most common BPE versions).
  • Extremely fast (both training and tokenization), thanks to the Rust implementation. Takes less than 20 seconds to tokenize a GB of text on a server's CPU.
  • Easy to use, but also extremely versatile.
  • Designed for research and production.
  • Normalization comes with alignment tracking: it's always possible to get the part of the original sentence that corresponds to a given token (see the sketch after this list).
  • Does all the pre-processing: truncation, padding, and adding the special tokens your model needs.
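
As a quick illustration of the last two points, here is a minimal sketch assuming you already have a trained CharBPETokenizer (the vocab/merges paths and the padding options are placeholders; exact option names may vary slightly between versions):

from tokenizers import CharBPETokenizer

# Placeholder paths: any trained vocab/merges pair will do
tokenizer = CharBPETokenizer("./path/to/vocab.json", "./path/to/merges.txt")

# Pre-processing: truncate and pad every encoding to a fixed length
tokenizer.enable_truncation(max_length=16)
tokenizer.enable_padding(max_length=16)

sentence = "I can feel the magic, can you?"
encoded = tokenizer.encode(sentence)
print(encoded.tokens)    # the produced tokens
print(encoded.offsets)   # (start, end) character spans in the original sentence

# Alignment tracking: recover the original text behind the third token
start, end = encoded.offsets[2]
print(sentence[start:end])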

Installation

With pip:

pip install tokenizers

From sources:

To use this method, you need to have Rust installed:

# Install Rust with rustup:
curl https://sh.rustup.rs -sSf | sh -s -- -y
export PATH="$HOME/.cargo/bin:$PATH"

Once Rust is installed, you can compile the bindings by doing the following:

git clone https://github.com/huggingface/tokenizers
cd tokenizers/bindings/python

# Create a virtual env (you can use yours as well)
python -m venv .env
source .env/bin/activate

# Install `tokenizers` in the current virtual env
pip install setuptools_rust
python setup.py install
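
To check that the build worked, a quick smoke test (assuming the package exposes a __version__ attribute, as its releases do):

# Run inside the virtual env where tokenizers was just installed
import tokenizers
print(tokenizers.__version__)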

Using the provided Tokenizers

Using a pre-trained tokenizer is really simple:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
tokenizer = CharBPETokenizer(vocab, merges)

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)
print(encoded.tokens)
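
The Encoding also lets you go back to text. A small sketch reusing tokenizer and encoded from the snippet above:

# Decoding uses the decoder attached to the tokenizer
print(tokenizer.decode(encoded.ids))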

And you can train yours just as simply:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
tokenizer = CharBPETokenizer()

# Then train it!
tokenizer.train([ "./path/to/files/1.txt", "./path/to/files/2.txt" ])

# And you can use it
encoded = tokenizer.encode("I can feel the magic, can you?")

# And finally save it somewhere
tokenizer.save("./path/to/directory", "my-bpe")
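
The save call above writes the model's vocabulary and merges files, which you can later load back into a fresh tokenizer. A sketch, assuming the my-bpe-vocab.json / my-bpe-merges.txt naming produced by this version (check the output directory for the exact names):

from tokenizers import CharBPETokenizer

# Reload the files written by save() above (file names assumed, verify on disk)
tokenizer = CharBPETokenizer(
    "./path/to/directory/my-bpe-vocab.json",
    "./path/to/directory/my-bpe-merges.txt",
)
print(tokenizer.encode("I can feel the magic, can you?").tokens)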

Provided Tokenizers

  • CharBPETokenizer: The original BPE
  • ByteLevelBPETokenizer: The byte-level version of BPE
  • SentencePieceBPETokenizer: A BPE implementation compatible with the one used by SentencePiece
  • BertWordPieceTokenizer: The famous BERT tokenizer, using WordPiece

All of these can be used and trained as explained above!
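
For example, a minimal sketch with the BERT WordPiece tokenizer, assuming you have a BERT-style vocab.txt on disk (the path and the lowercase option are placeholders/assumptions):

from tokenizers import BertWordPieceTokenizer

# Path and `lowercase` option are assumptions; adjust to your vocabulary
tokenizer = BertWordPieceTokenizer("./path/to/bert-vocab.txt", lowercase=True)

encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.tokens)  # includes the [CLS]/[SEP] special tokens added by the tokenizer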

Build your own

You can also easily build your own tokenizer by putting together the different parts you need:

Use a pre-trained tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, processors

# Load a BPE Model
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
bpe = models.BPE(vocab, merges)

# Initialize a tokenizer
tokenizer = Tokenizer(bpe)

# Customize pre-tokenization, decoding and post-processing
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=True)
tokenizer.decoder = decoders.ByteLevel()
tokenizer.post_processor = processors.ByteLevel(trim_offsets=True)

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)
print(encoded.tokens)

# Or tokenize multiple sentences at once:
encoded = tokenizer.encode_batch([
	"I can feel the magic, can you?",
	"The quick brown fox jumps over the lazy dog"
])
print(encoded)
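
encode_batch returns one Encoding per input sentence, so you typically iterate over the result:

# Each element of the batch is a regular Encoding
for enc in encoded:
    print(enc.ids, enc.tokens)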

Train a new tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, trainers, processors

# Initialize a tokenizer
tokenizer = Tokenizer(models.BPE())

# Customize pre-tokenization, decoding and post-processing
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=True)
tokenizer.decoder = decoders.ByteLevel()
tokenizer.post_processor = processors.ByteLevel(trim_offsets=True)

# And then train
trainer = trainers.BpeTrainer(vocab_size=20000, min_frequency=2)
tokenizer.train(trainer, [
	"./path/to/dataset/1.txt",
	"./path/to/dataset/2.txt",
	"./path/to/dataset/3.txt"
])

# Now we can encode
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded)
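
If your model relies on special tokens, they can be reserved at training time. A sketch, assuming the BpeTrainer accepts a special_tokens list (the token names below are placeholders):

from tokenizers import Tokenizer, models, trainers

# Same setup as above, but reserving a few special tokens in the vocabulary
tokenizer = Tokenizer(models.BPE())
trainer = trainers.BpeTrainer(
    vocab_size=20000,
    min_frequency=2,
    special_tokens=["<unk>", "<pad>", "<s>", "</s>"],  # assumed option name
)
tokenizer.train(trainer, ["./path/to/dataset/1.txt"])

# Special tokens are now part of the vocabulary
print(tokenizer.token_to_id("<pad>"))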
