This repository contains an easy and intuitive approach to few-shot NER using most similar expansion over spaCy embeddings. Now with entity confidence scores!

Project description

Concise Concepts

When you want to apply NER to concise concepts, it is easy to come up with examples, but difficult to train an entire pipeline. Concise Concepts uses few-shot NER based on word-embedding similarity to get you going with ease! Now with entity scoring!


Install

pip install concise-concepts
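
The examples below use spaCy's en_core_web_lg model, which ships with the word vectors this package expands over and has to be downloaded separately:

python -m spacy download en_core_web_lg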

Quickstart

import spacy
from spacy import displacy
import concise_concepts

data = {
    "fruit": ["apple", "pear", "orange"],
    "vegetable": ["broccoli", "spinach", "tomato"],
    "meat": ["beef", "pork", "fish", "lamb"]
}

text = """
    Heat the oil in a large pan and add the onion, celery and carrots.
    Then, cook over a medium–low heat for 10 minutes, or until softened. 
    Add the courgette, garlic, red peppers and oregano and cook for 2–3 minutes.
    Later, add some oranges and chickens. """

nlp = spacy.load("en_core_web_lg", disable=["ner"])
# ent_score enables entity confidence scoring
nlp.add_pipe("concise_concepts", config={"data": data, "ent_score": True})
doc = nlp(text)

options = {"colors": {"fruit": "darkorange", "vegetable": "limegreen", "meat": "salmon"},
           "ents": ["fruit", "vegetable", "meat"]}

ents = doc.ents
for ent in ents:
    # append each entity's confidence score to its label for rendering
    new_label = f"{ent.label_} ({float(ent._.ent_score):.0%})"
    options["colors"][new_label] = options["colors"].get(ent.label_.lower(), None)
    options["ents"].append(new_label)
    ent.label_ = new_label
doc.ents = ents

displacy.render(doc, style="ent", options=options)
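
displacy.render is aimed at Jupyter notebooks; when running this as a plain script, displacy.serve renders the same visualization on a local web server instead:

# serve the visualization at http://localhost:5000 (spaCy's default port)
displacy.serve(doc, style="ent", options=options)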

Use a specific number of words to expand over

import spacy
import concise_concepts

data = {
    "fruit": ["apple", "pear", "orange"],
    "vegetable": ["broccoli", "spinach", "tomato"],
    "meat": ["beef", "pork", "fish", "lamb"]
}

# one topn value per concept in data
topn = [50, 50, 150]

assert len(topn) == len(data)

nlp = spacy.load("en_core_web_lg", disable=["ner"])
nlp.add_pipe("concise_concepts", config={"data": data, "topn": topn})
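
Presumably the topn values are matched to the concepts by position, following the insertion order of the keys in data; the pairing-by-order is inferred from the length assertion above, not spelled out by the library. A quick illustration of that mapping:

# pair each concept with its expansion size, by insertion order (an assumption)
for label, n in zip(data, topn):
    print(f"{label}: expand over the {n} most similar words")
# fruit: expand over the 50 most similar words
# vegetable: expand over the 50 most similar words
# meat: expand over the 150 most similar words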

Use word similarity to score entities

import spacy
import concise_concepts

data = {
    "ORG": ["Google", "Apple", "Amazon"],
    "GPE": ["Netherlands", "France", "China"],
}

text = """Sony was founded in Japan."""

nlp = spacy.load("en_core_web_lg")
nlp.add_pipe("concise_concepts", config={"data": data, "ent_score": True})
doc = nlp(text)

print([(ent.text, ent.label_, ent._.ent_score) for ent in doc.ents])
# output
#
# [('Sony', 'ORG', 0.63740385), ('Japan', 'GPE', 0.5896993)]
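
Because ent._.ent_score is a plain float, it can also act as a confidence filter after the pipeline has run. A minimal sketch (the 0.6 threshold is an arbitrary choice for illustration, not a library default):

# keep only entities with a reasonably high similarity score (0.6 is arbitrary)
confident = [ent for ent in doc.ents if ent._.ent_score > 0.6]
print([(ent.text, ent.label_) for ent in confident])
# with the scores above, only ('Sony', 'ORG') survives the cutoff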

Use a gensim.word2vec model, either pre-trained from gensim or from a custom model path

import spacy
import concise_concepts

data = {
    "fruit": ["apple", "pear", "orange"],
    "vegetable": ["broccoli", "spinach", "tomato"],
    "meat": ["beef", "pork", "fish", "lamb"]
}

# a model name from https://radimrehurek.com/gensim/downloader.html or a path to a local file
model_path = "glove-twitter-25"

nlp = spacy.load("en_core_web_lg", disable=["ner"])
nlp.add_pipe("concise_concepts", config={"data": data, "model_path": model_path})
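
To check that a pre-trained name resolves before wiring it into the pipeline, the model can be loaded directly with gensim's downloader API (a quick sanity check, assuming gensim is installed):

import gensim.downloader as api

# downloads and caches the vectors on first use
vectors = api.load("glove-twitter-25")
print(vectors.most_similar("apple", topn=3))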

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

concise-concepts-0.5.3.tar.gz (7.9 kB)

Uploaded Source

Built Distribution

concise_concepts-0.5.3-py3-none-any.whl (9.5 kB)

Uploaded Python 3

File details

Details for the file concise-concepts-0.5.3.tar.gz.

File metadata

  • Download URL: concise-concepts-0.5.3.tar.gz
  • Size: 7.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.11 CPython/3.8.2 Windows/10

File hashes

Hashes for concise-concepts-0.5.3.tar.gz:

  • SHA256: 9f896830801c9f342975aac4bf4597e3d20ebdd68b4e70ca647a7d214a0a4ae0
  • MD5: 3adb980dbdafa34d6cfb22a4b33d20d8
  • BLAKE2b-256: b46c9a8a0def5eb3b61c4a92be345fd9a1b80e2840b1d12b81bd0253cc9e05a3


File details

Details for the file concise_concepts-0.5.3-py3-none-any.whl.

File hashes

Hashes for concise_concepts-0.5.3-py3-none-any.whl:

  • SHA256: 3d740f74fd659826c0d32435580ce1b2e286dabbf18ef51d71a9daaea3e0e518
  • MD5: ae063e9ec6ecb43f1dbde0dc7169031d
  • BLAKE2b-256: fb846dbe7e73385da13ba912d87ae6406367248d8e714a8c8f9ed67ad0f16e36

