
Fast and Customizable Tokenizers

Project description





Tokenizers

Provides an implementation of today's most used tokenizers, with a focus on performance and versatility.

These are Python bindings over the Rust implementation. If you are interested in the high-level design, you can check it out in the main repository.

Otherwise, let's dive in!

Main features:

  • Train new vocabularies and tokenize using 4 pre-made tokenizers (Bert WordPiece and the 3 most common BPE versions).
  • Extremely fast (both training and tokenization), thanks to the Rust implementation. Takes less than 20 seconds to tokenize a GB of text on a server's CPU.
  • Easy to use, but also extremely versatile.
  • Designed for research and production.
  • Normalization comes with alignment tracking: it's always possible to get the part of the original sentence that corresponds to a given token (see the sketch just after this list).
  • Does all the pre-processing: truncation, padding, and adding the special tokens your model needs.
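
For example, assuming a tokenizer has already been loaded (the vocabulary paths below are placeholders), the offsets kept by the alignment tracking and the truncation option can be used like this (a minimal sketch):

from tokenizers import CharBPETokenizer

# Placeholder paths: substitute your own vocabulary files
tokenizer = CharBPETokenizer("./path/to/vocab.json", "./path/to/merges.txt")

# Pre-processing: truncate long inputs (padding is enabled similarly with enable_padding)
tokenizer.enable_truncation(max_length=128)

sentence = "I can feel the magic, can you?"
encoded = tokenizer.encode(sentence)

# Alignment tracking: each token carries (start, end) offsets into the original sentence
for token, (start, end) in zip(encoded.tokens, encoded.offsets):
	print(token, "->", sentence[start:end])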

Installation

With pip:

pip install tokenizers

From sources:

To use this method, you need to have Rust installed:

# Install with:
curl https://sh.rustup.rs -sSf | sh -s -- -y
export PATH="$HOME/.cargo/bin:$PATH"

Once Rust is installed, you can compile by doing the following:

git clone https://github.com/huggingface/tokenizers
cd tokenizers/bindings/python

# Create a virtual env (you can also use your own)
python -m venv .env
source .env/bin/activate

# Install `tokenizers` in the current virtual env
pip install setuptools_rust
python setup.py install
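
If the build succeeded, a quick import from the same virtual env should work (just a sanity check; this assumes the package exposes a __version__ attribute, as recent releases do):

# Check that the compiled bindings import correctly
python -c "import tokenizers; print(tokenizers.__version__)"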

Using the provided Tokenizers

Using a pre-trained tokenizer is really simple:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
tokenizer = CharBPETokenizer(vocab, merges)

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)
print(encoded.tokens)

And you can train your own just as easily:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
tokenizer = CharBPETokenizer()

# Then train it!
tokenizer.train([ "./path/to/files/1.txt", "./path/to/files/2.txt" ])

# And you can use it
encoded = tokenizer.encode("I can feel the magic, can you?")

# And finally save it somewhere
tokenizer.save("./path/to/directory", "my-bpe")

Provided Tokenizers

  • CharBPETokenizer: The original BPE
  • ByteLevelBPETokenizer: The byte level version of the BPE
  • SentencePieceBPETokenizer: A BPE implementation compatible with the one used by SentencePiece
  • BertWordPieceTokenizer: The famous Bert tokenizer, using WordPiece

All of these can be used and trained as explained above!
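
For instance, the Bert tokenizer only needs a vocabulary file (the path and the lowercase flag below are illustrative; lowercase matches the uncased Bert checkpoints):

from tokenizers import BertWordPieceTokenizer

# Placeholder vocabulary path; lowercase=True matches uncased Bert models
tokenizer = BertWordPieceTokenizer("./path/to/bert-vocab.txt", lowercase=True)

encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.tokens)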

Build your own

You can also easily build your own tokenizer by putting together all the different parts you need:

Use a pre-trained tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, processors

# Load a BPE Model
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
bpe = models.BPE(vocab, merges)

# Initialize a tokenizer
tokenizer = Tokenizer(bpe)

# Customize pre-tokenization and decoding
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=True)
tokenizer.decoder = decoders.ByteLevel()
tokenizer.post_processor = processors.ByteLevel(trim_offsets=True)

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)
print(encoded.tokens)

# Or tokenize multiple sentences at once:
encoded = tokenizer.encode_batch([
	"I can feel the magic, can you?",
	"The quick brown fox jumps over the lazy dog"
])
print(encoded)
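
Since a decoder was configured above, the ids can also be mapped back to text (a short sketch continuing from the batch call; decode_batch takes one list of ids per encoding):

# Map ids back to text with the ByteLevel decoder set above
print(tokenizer.decode(encoded[0].ids))
print(tokenizer.decode_batch([e.ids for e in encoded]))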

Train a new tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, trainers, processors

# Initialize a tokenizer
tokenizer = Tokenizer(models.BPE())

# Customize pre-tokenization and decoding
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=True)
tokenizer.decoder = decoders.ByteLevel()
tokenizer.post_processor = processors.ByteLevel(trim_offsets=True)

# And then train
trainer = trainers.BpeTrainer(vocab_size=20000, min_frequency=2)
tokenizer.train(trainer, [
	"./path/to/dataset/1.txt",
	"./path/to/dataset/2.txt",
	"./path/to/dataset/3.txt"
])

# Now we can encode
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded)
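
If your model needs special tokens, they can be reserved at training time and looked up afterwards. The token names below are arbitrary placeholders (a hedged sketch using the same trainer API as above):

# Reserve special tokens during training, then look up their ids
trainer = trainers.BpeTrainer(
	vocab_size=20000,
	min_frequency=2,
	special_tokens=["<pad>", "<unk>", "<s>", "</s>"]
)
tokenizer.train(trainer, ["./path/to/dataset/1.txt"])
print(tokenizer.token_to_id("<pad>"))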


Download files

Download the file for your platform.

Source Distribution

  • tokenizers-0.7.0rc7.tar.gz (81.1 kB, Source)

Built Distributions

  • tokenizers-0.7.0rc7-cp38-cp38-win_amd64.whl (1.1 MB, CPython 3.8, Windows x86-64)
  • tokenizers-0.7.0rc7-cp38-cp38-win32.whl (973.4 kB, CPython 3.8, Windows x86)
  • tokenizers-0.7.0rc7-cp38-cp38-manylinux1_x86_64.whl (7.5 MB, CPython 3.8, manylinux1 x86-64)
  • tokenizers-0.7.0rc7-cp38-cp38-macosx_10_10_x86_64.whl (1.2 MB, CPython 3.8, macOS 10.10+ x86-64)
  • tokenizers-0.7.0rc7-cp37-cp37m-win_amd64.whl (1.1 MB, CPython 3.7m, Windows x86-64)
  • tokenizers-0.7.0rc7-cp37-cp37m-win32.whl (974.4 kB, CPython 3.7m, Windows x86)
  • tokenizers-0.7.0rc7-cp37-cp37m-manylinux1_x86_64.whl (5.6 MB, CPython 3.7m, manylinux1 x86-64)
  • tokenizers-0.7.0rc7-cp37-cp37m-macosx_10_10_x86_64.whl (1.2 MB, CPython 3.7m, macOS 10.10+ x86-64)
  • tokenizers-0.7.0rc7-cp36-cp36m-win_amd64.whl (1.1 MB, CPython 3.6m, Windows x86-64)
  • tokenizers-0.7.0rc7-cp36-cp36m-win32.whl (974.0 kB, CPython 3.6m, Windows x86)
  • tokenizers-0.7.0rc7-cp36-cp36m-manylinux1_x86_64.whl (3.8 MB, CPython 3.6m, manylinux1 x86-64)
  • tokenizers-0.7.0rc7-cp36-cp36m-macosx_10_10_x86_64.whl (1.2 MB, CPython 3.6m, macOS 10.10+ x86-64)
  • tokenizers-0.7.0rc7-cp35-cp35m-win_amd64.whl (1.1 MB, CPython 3.5m, Windows x86-64)
  • tokenizers-0.7.0rc7-cp35-cp35m-win32.whl (974.0 kB, CPython 3.5m, Windows x86)
  • tokenizers-0.7.0rc7-cp35-cp35m-manylinux1_x86_64.whl (1.9 MB, CPython 3.5m, manylinux1 x86-64)
  • tokenizers-0.7.0rc7-cp35-cp35m-macosx_10_10_x86_64.whl (1.2 MB, CPython 3.5m, macOS 10.10+ x86-64)

