
LightGBM Tools

Project description


This Python package implements tools for LightGBM. In its current version, lightgbm-tools focuses on binary classification metrics.

What exact problem does this tool solve?

LightGBM ships with a number of built-in metrics. These are useful but limited: some important metrics, such as the F1 score and average precision (AP), are missing.

This tool makes it easy to add such metrics through a mechanism built into LightGBM: a callback assigned to the feval parameter of lightgbm.train (see the lightgbm.train documentation).
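The feval contract that LightGBM expects can be sketched as follows. This is a minimal, self-contained illustration: DummyDataset merely stands in for lightgbm.Dataset, and accuracy_eval is a hypothetical example metric, not part of this package.

```python
# Sketch of the feval contract used by lightgbm.train:
# a custom eval function receives the raw predictions and the
# evaluation Dataset, and returns (name, value, is_higher_better).

class DummyDataset:
    """Stand-in for lightgbm.Dataset, exposing only get_label()."""

    def __init__(self, labels):
        self._labels = labels

    def get_label(self):
        return self._labels


def accuracy_eval(preds, eval_data):
    labels = eval_data.get_label()
    # threshold the raw scores at 0.5 to get binary predictions
    binary = [1 if p >= 0.5 else 0 for p in preds]
    correct = sum(1 for b, y in zip(binary, labels) if b == y)
    return "accuracy", correct / len(labels), True


name, value, is_higher_better = accuracy_eval(
    [0.9, 0.2, 0.7, 0.4], DummyDataset([1, 0, 0, 1])
)
```

The predefined callbacks in lightgbm-tools follow this same contract, so they can be passed to feval directly.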

Maintainers

This project is maintained by the One Conversation team of Deutsche Telekom AG.

Usage

You can find a fully functional example here: https://github.com/telekom/lightgbm-tools/blob/main/examples/main_usage.py

The easiest way is to use the predefined callback functions. These are:

  • lightgbm_tools.metrics.lgbm_f1_score_callback
  • lightgbm_tools.metrics.lgbm_accuracy_score_callback
  • lightgbm_tools.metrics.lgbm_average_precision_score_callback
  • lightgbm_tools.metrics.lgbm_roc_auc_score_callback
  • lightgbm_tools.metrics.lgbm_recall_score_callback
  • lightgbm_tools.metrics.lgbm_precision_score_callback

Here F1 is used as an example to show how the predefined callback functions can be used:

import lightgbm
from lightgbm_tools.metrics import lgbm_f1_score_callback

# params, train_data, val_data and evals_result are assumed to be
# set up as usual (see the fully functional example linked above)
bst = lightgbm.train(
    params,
    train_data,
    valid_sets=val_data,
    num_boost_round=6,
    verbose_eval=False,
    evals_result=evals_result,
    feval=lgbm_f1_score_callback,  # here we pass the callback to LightGBM
)
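The params dictionary in the snippets is ordinary LightGBM configuration. For binary classification with a custom feval, something along these lines is typical (the values here are illustrative, not recommendations):

```python
# Illustrative LightGBM parameters for binary classification.
# "metric": "None" (the string, not a Python None) disables the
# built-in metrics, so only the custom feval results are reported.
params = {
    "objective": "binary",
    "metric": "None",
    "learning_rate": 0.1,
    "verbosity": -1,
}
```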

You can also reuse other implementations of metrics. Here is an example of how to do this using the balanced_accuracy_score from scikit-learn:

import lightgbm
from sklearn.metrics import balanced_accuracy_score
from lightgbm_tools.metrics import LightGbmEvalFunction, binary_eval_callback_factory

# define own custom eval (metric) function for balanced_accuracy_score
lgbm_balanced_accuracy = LightGbmEvalFunction(
    name="balanced_accuracy",
    function=balanced_accuracy_score,
    is_higher_better=True,
    needs_binary_predictions=True,
)

# use the factory function to create the callback
lgbm_balanced_accuracy_callback = binary_eval_callback_factory([lgbm_balanced_accuracy])

bst = lightgbm.train(
    params,
    train_data,
    valid_sets=val_data,
    num_boost_round=6,
    verbose_eval=False,
    evals_result=evals_result,
    feval=lgbm_balanced_accuracy_callback,  # here we pass the callback to LightGBM
)
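For intuition about the metric being plugged in here: balanced accuracy is the mean of the per-class recalls, which makes it robust to class imbalance. The hand-rolled version below is only an illustration of what sklearn's balanced_accuracy_score computes; the example above uses the scikit-learn implementation itself.

```python
# Hand-rolled balanced accuracy for illustration only:
# the mean of the recall on each class.
def balanced_accuracy(y_true, y_pred):
    recalls = []
    for cls in (0, 1):
        idx = [i for i, y in enumerate(y_true) if y == cls]
        hit = sum(1 for i in idx if y_pred[i] == cls)
        recalls.append(hit / len(idx))
    return sum(recalls) / len(recalls)


# class 1 recall is 2/3, class 0 recall is 1.0 -> mean is (1.0 + 2/3) / 2
score = balanced_accuracy([1, 1, 1, 0], [1, 1, 0, 0])
```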

This tool can also be used to calculate multiple metrics at the same time, by passing several metric definitions (in a list) to binary_eval_callback_factory. The following predefined metric definitions (LightGbmEvalFunction instances) are available:

  • lightgbm_tools.metrics.lgbm_f1_score
  • lightgbm_tools.metrics.lgbm_accuracy_score
  • lightgbm_tools.metrics.lgbm_average_precision_score
  • lightgbm_tools.metrics.lgbm_roc_auc_score
  • lightgbm_tools.metrics.lgbm_recall_score
  • lightgbm_tools.metrics.lgbm_precision_score

Below is an example of how to combine F1 and average precision:

import lightgbm
from lightgbm_tools.metrics import (
    binary_eval_callback_factory,
    lgbm_average_precision_score,
    lgbm_f1_score,
)

# use the factory function to create the callback
callback = binary_eval_callback_factory([lgbm_average_precision_score, lgbm_f1_score])

bst = lightgbm.train(
    params,
    train_data,
    valid_sets=val_data,
    num_boost_round=6,
    verbose_eval=False,
    evals_result=evals_result,
    feval=callback,  # here we pass the callback to LightGBM
)
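After training, evals_result is populated with one entry per validation set and per metric. The nested shape can be sketched as below; "valid_0" is LightGBM's default name for an unnamed validation set, the metric keys come from the name fields of the metric definitions, and the numbers are made up for illustration.

```python
# Illustrative shape of evals_result after training with the
# combined callback above (one value per boosting round).
evals_result = {
    "valid_0": {
        "average_precision": [0.81, 0.84, 0.86, 0.87, 0.88, 0.88],
        "f1": [0.70, 0.73, 0.75, 0.76, 0.77, 0.78],
    }
}

# e.g. pick the best F1 score across rounds
best_f1 = max(evals_result["valid_0"]["f1"])
```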

Installation

lightgbm-tools can be installed with pip:

pip install lightgbm-tools

To develop and run the unit tests locally, make sure all relevant requirements are installed. You will probably want to install the package in "editable mode":

pip install -e .[all]

Licensing

Copyright (c) 2022 Philip May, Deutsche Telekom AG

Licensed under the MIT License (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License by reviewing the file LICENSE in the repository.
