
Reason this release was yanked: wrong model

Project description

An NSFW pipeline that classifies prompts using a character embedding model.

Feature | Description |
--- | --- |
Name | en_prompt_nsfw_pipeline_char_embed |
Version | 0.1.0 |
spaCy | >=2.0.0,<3.0.0 |
Default Pipeline | textcat |
Components | textcat |
Vectors | 0 keys, 0 unique vectors (0 dimensions) |
Sources | n/a |
License | UNLICENSED |
Author | Jiayu Liu |

### Label Scheme

<details>

<summary>View label scheme (4 labels for 1 component)</summary>

Component | Labels |
--- | --- |
`textcat` | adult, cp, underage_safe, safe |

</details>
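A minimal usage sketch, assuming the sdist below has been installed into an environment with a compatible spaCy v2 (the example prompt text is illustrative):

```python
import spacy

# Assumes the model package was installed first, e.g.:
#   pip install en_prompt_nsfw_pipeline_char_embed-0.1.0.tar.gz
nlp = spacy.load("en_prompt_nsfw_pipeline_char_embed")

doc = nlp("a cute cat sitting on a windowsill")

# The textcat component writes one score per label to doc.cats.
for label, score in sorted(doc.cats.items(), key=lambda kv: -kv[1]):
    print(f"{label}: {score:.3f}")
```

The scores correspond to the four labels above. Since this release was yanked as the wrong model, predictions from this particular artifact may not be meaningful.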

### Accuracy

Type | Score |
--- | --- |
CATS_SCORE | 88.95 |
CATS_MICRO_P | 89.02 |
CATS_MICRO_R | 89.02 |
CATS_MICRO_F | 89.02 |
CATS_MACRO_P | 89.27 |
CATS_MACRO_R | 88.75 |
CATS_MACRO_F | 88.95 |
CATS_MACRO_AUC | 97.80 |
TEXTCAT_LOSS | 686.95 |



Download files


Source Distribution

File details

Details for the file en_prompt_nsfw_pipeline_char_embed-0.1.0.tar.gz.


File hashes

Hashes for en_prompt_nsfw_pipeline_char_embed-0.1.0.tar.gz

Algorithm | Hash digest |
--- | --- |
SHA256 | 57762c9208f42d591fd32d3b04938f417834b0e502b6f3d79154e95ab9a96d17 |
MD5 | 1a59b3054abed68cc073d9ba9b3b5227 |
BLAKE2b-256 | 2bf0e940cb52a93f17af72a421b5926a1f5badb61a02417e02d14b2253f3a5e7 |
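A minimal sketch for checking a downloaded copy against the SHA256 digest above; the local file path is an assumption:

```python
import hashlib

# Hypothetical local path to the downloaded sdist; adjust as needed.
path = "en_prompt_nsfw_pipeline_char_embed-0.1.0.tar.gz"
expected = "57762c9208f42d591fd32d3b04938f417834b0e502b6f3d79154e95ab9a96d17"

# Hash the file contents and compare against the published digest.
with open(path, "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("OK" if actual == expected else "MISMATCH")
```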


