
# Fairness Indicators BETA

![Fairness_Indicators](./images/fairnessIndicators.png)

Fairness Indicators is designed to support teams in evaluating and improving models for fairness concerns, in partnership with the broader TensorFlow toolkit.

The tool is actively used internally on many of our products, and is now available in BETA for you to try on your own use cases. We would love to partner with you to understand where Fairness Indicators is most useful and where added functionality would be valuable. Please reach out at tfx@tensorflow.org. You can provide feedback on your experience, and feature requests, here.

## What is Fairness Indicators?

Fairness Indicators enables easy computation of commonly identified fairness metrics for binary and multiclass classifiers.

Many existing tools for evaluating fairness concerns don’t work well on large-scale datasets and models. At Google, it is important for us to have tools that can work on billion-user systems. Fairness Indicators allows you to evaluate use cases of any size.

In particular, Fairness Indicators includes the ability to:

  • Evaluate the distribution of datasets

  • Evaluate model performance, sliced across defined groups of users

  • Feel confident about your results with confidence intervals and evaluations at multiple thresholds

  • Dive deep into individual slices to explore root causes and opportunities for improvement
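The per-slice, multi-threshold evaluation described above can be loosely sketched in plain Python. This is a hypothetical illustration of the idea, not the library's API; the function names and data below are invented:

```python
# Hypothetical sketch: false positive rate (FPR) per user group at
# several decision thresholds, the kind of sliced metric Fairness
# Indicators computes at scale.
from collections import defaultdict

def false_positive_rate(rows, threshold):
    """FPR = FP / (FP + TN), computed over the negative examples in rows."""
    fp = sum(1 for label, score in rows if label == 0 and score >= threshold)
    tn = sum(1 for label, score in rows if label == 0 and score < threshold)
    return fp / (fp + tn) if (fp + tn) else 0.0

def sliced_fpr(examples, thresholds):
    """Group (slice_key, label, score) examples by slice, then evaluate
    the FPR of each slice at every threshold."""
    slices = defaultdict(list)
    for slice_key, label, score in examples:
        slices[slice_key].append((label, score))
    return {key: {t: false_positive_rate(rows, t) for t in thresholds}
            for key, rows in slices.items()}

examples = [
    ("group_a", 0, 0.9), ("group_a", 0, 0.2), ("group_a", 1, 0.8),
    ("group_b", 0, 0.6), ("group_b", 0, 0.4), ("group_b", 1, 0.7),
]
metrics = sliced_fpr(examples, thresholds=(0.3, 0.5, 0.7))
```

Comparing `metrics["group_a"]` against `metrics["group_b"]` at each threshold is the kind of side-by-side view the tool renders, with confidence intervals layered on top.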

This [Introductory Video](https://www.youtube.com/watch?v=pHT-ImFXPQo) provides an example of how Fairness Indicators can be used on one of our own products to evaluate fairness concerns over time. This Demo Colab provides a hands-on experience of using Fairness Indicators.

[![](http://img.youtube.com/vi/pHT-ImFXPQo/0.jpg)](http://www.youtube.com/watch?v=pHT-ImFXPQo "")

The pip package download includes:

  • TensorFlow Data Validation (TFDV) [analyze the distribution of your dataset]

  • TensorFlow Model Analysis (TFMA) [analyze model performance]

  • Fairness Indicators [an addition to TFMA that adds fairness metrics and the ability to easily compare performance across slices]

  • The What-If Tool (WIT) [an interactive visual interface for probing your models]
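Assuming the package name shown on this page, a typical setup step would be (a sketch, not official install instructions):

```shell
# Install Fairness Indicators from PyPI; TFDV, TFMA, and WIT come
# along as dependencies of the package.
pip install fairness_indicators
```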

## How can I use Fairness Indicators?

### TensorFlow Models

Not using existing TensorFlow tools? No worries!

### Non-TensorFlow Models
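Because the evaluation only needs labels, prediction scores, and a slice key, outputs from any framework can be compared. As a hypothetical sketch (invented function and numbers, not the library's API), the slice-versus-baseline comparison the tool surfaces looks like:

```python
# Hypothetical sketch: report each slice's metric as a gap relative to
# a chosen baseline slice, the comparison the Fairness Indicators UI shows.
def gaps_from_baseline(metric_by_slice, baseline):
    """Return metric[slice] - metric[baseline] for every slice."""
    base = metric_by_slice[baseline]
    return {key: value - base for key, value in metric_by_slice.items()}

# False negative rates produced by any model, TensorFlow or not.
fnr_by_slice = {"overall": 0.12, "group_a": 0.10, "group_b": 0.30}
gaps = gaps_from_baseline(fnr_by_slice, baseline="overall")
```

A positive gap (here, `gaps["group_b"]`) flags a slice that underperforms the baseline and is worth diving into.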

## Examples

The [examples](https://github.com/tensorflow/fairness-indicators/tree/master/examples) directory contains several examples.

## More questions?

For more information on how to think about fairness evaluation in the context of your use case, see [this link](https://github.com/tensorflow/fairness-indicators/blob/master/documentation/guidance.md).

If you have found a bug in Fairness Indicators, please file a GitHub issue with as much supporting information as you can provide.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

fairness_indicators-0.1.0.dev3.tar.gz (121.3 kB)


Built Distributions

fairness_indicators-0.1.0.dev3-py3-none-any.whl (127.6 kB)


fairness_indicators-0.1.0.dev3-py2-none-any.whl (127.6 kB)


File details

Details for the file fairness_indicators-0.1.0.dev3.tar.gz.

File metadata

  • Download URL: fairness_indicators-0.1.0.dev3.tar.gz
  • Size: 121.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.3

File hashes

Hashes for fairness_indicators-0.1.0.dev3.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 3f77c4a24e6bca3e9b0b518a58873f5035017ba923838128b1332fcd503b0ea3 |
| MD5 | f7efad0477cb70e8a9993d81d11fb622 |
| BLAKE2b-256 | 16e0c61c28659cb6840ec48b59b92503df64c8e58606321c90c575a4f67dece9 |



File details

Details for the file fairness_indicators-0.1.0.dev3-py3-none-any.whl.

File metadata

  • Download URL: fairness_indicators-0.1.0.dev3-py3-none-any.whl
  • Size: 127.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.3

File hashes

Hashes for fairness_indicators-0.1.0.dev3-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | c4fe7f8a97bd0470728ee3968707d1b5b8ab23ff7edffb2eaf6d5ce1cc666426 |
| MD5 | 6162258e936ca90ae7bc5fde531f8236 |
| BLAKE2b-256 | 759ce54f998b3252eb36f329b54da9c28a12f2ea7f9ae971a0aefe5c3f7c9180 |



File details

Details for the file fairness_indicators-0.1.0.dev3-py2-none-any.whl.

File metadata

  • Download URL: fairness_indicators-0.1.0.dev3-py2-none-any.whl
  • Size: 127.6 kB
  • Tags: Python 2
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.3

File hashes

Hashes for fairness_indicators-0.1.0.dev3-py2-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | bd90dd11488b0ffd9e758c76cef07d03c27f2174572853602a8e897fc688fbd2 |
| MD5 | ed55adfd8643c815d321a8a6c38ab299 |
| BLAKE2b-256 | 01be8137fd67e2221138546ee527b3bc1cd5aa90f1dc5fbfb229dd5d53b7df80 |


