NSFW pipeline that classifies prompts using a Bi-LSTM model.
| Feature | Description |
| --- | --- |
| **Name** | `en_prompt_nsfw_pipeline_bilstm` |
| **Version** | `0.3.0` |
| **spaCy** | `>=3.0.0,<4.0.0` |
| **Default Pipeline** | `tok2vec`, `textcat` |
| **Components** | `tok2vec`, `textcat` |
| **Vectors** | 514157 keys, 20000 unique vectors (300 dimensions) |
| **Sources** | n/a |
| **License** | UNLICENSED |
| **Author** | Jiayu Liu |
### Label Scheme

<details>

<summary>View label scheme (4 labels for 1 component)</summary>

| Component | Labels |
| --- | --- |
| `textcat` | `safe`, `cp`, `underage_safe`, `adult` |

</details>
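The pipeline can be loaded like any installed spaCy package. A minimal sketch, assuming the package has been installed with pip (the example prompt text and the `top_label` helper are illustrative, not part of the package):

```python
def top_label(cats):
    """Return the highest-scoring label and its score from doc.cats."""
    label = max(cats, key=cats.get)
    return label, cats[label]

if __name__ == "__main__":
    # Requires: pip install en_prompt_nsfw_pipeline_bilstm (spaCy >=3.0.0,<4.0.0)
    import spacy

    nlp = spacy.load("en_prompt_nsfw_pipeline_bilstm")
    doc = nlp("a watercolor painting of a mountain lake")  # example prompt
    # doc.cats maps each of the four labels (safe, cp, underage_safe, adult)
    # to a score; take the argmax as the predicted class.
    print(top_label(doc.cats))
```

`doc.cats` is spaCy's standard output attribute for `textcat` components, so downstream filtering can threshold on a specific label's score instead of using the argmax.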
### Accuracy

| Type | Score |
| --- | --- |
| `CATS_SCORE` | 91.79 |
| `CATS_MICRO_P` | 92.20 |
| `CATS_MICRO_R` | 92.20 |
| `CATS_MICRO_F` | 92.20 |
| `CATS_MACRO_P` | 92.48 |
| `CATS_MACRO_R` | 91.25 |
| `CATS_MACRO_F` | 91.79 |
| `CATS_MACRO_AUC` | 98.97 |
| `TOK2VEC_LOSS` | 149989.52 |
| `TEXTCAT_LOSS` | 626.11 |
### Hashes for en_prompt_nsfw_pipeline_bilstm-0.3.0.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | b06639fb920879650c07589fc8b81a392d778c4b2f647d9de56b93b89ce5eebb |
| MD5 | 1682e3fd3c719930e7bfda7935d47fb3 |
| BLAKE2b-256 | 708abee063f69148c6fc532df2f1e0befa80dee2a9f34406e48f4fdb741d1df7 |