
A Distributed DataFrame library for large scale complex data processing.

Project description

Daft dataframes can load any data - such as PDF documents, images, Protobuf messages, CSV and Parquet files, and audio - into a table dataframe structure for easy querying.


Website · Docs · Installation · 10-minute tour of Daft · Community and Support

Daft: the distributed Python dataframe for complex data

Daft is a fast, Pythonic and scalable open-source dataframe library built for Python and Machine Learning workloads.

Daft is currently in its Beta release phase, so expect bugs and rapid improvements to the project. We welcome user feedback and feature requests in our Discussions forum.


About Daft

The Daft dataframe is a table of data with rows and columns. Columns can contain any Python objects, which allows Daft to support rich complex data types such as images, audio, video and more.

  1. Any Data: Columns can contain any Python objects, which means that the Python libraries you already use for running machine learning or custom data processing will work natively with Daft!

  2. Notebook Computing: Daft is built for an interactive developer experience in notebooks - intelligent caching and query optimization accelerate your experimentation and data exploration.

  3. Distributed Computing: Rich complex formats such as images can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide
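As a rough sketch, optional dependencies can typically be requested with pip's extras syntax; the extra names shown here (`ray`, `aws`) are assumptions for illustration - the Installation Guide is the authoritative list and names may differ by release:

```shell
# Base install
pip install getdaft

# With optional dependencies (extra names assumed; check the Installation Guide)
pip install "getdaft[ray,aws]"

# From source, in editable mode (assumes a local clone of the repository)
pip install -e .
```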

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from an AWS S3 bucket and run a simple function to generate thumbnails for each image:

import daft

import io
from PIL import Image

def get_thumbnail(img: Image.Image) -> Image.Image:
    """Simple function to make an image thumbnail"""
    imgcopy = img.copy()
    imgcopy.thumbnail((48, 48))
    return imgcopy

# Load a dataframe from files in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# Get the AWS S3 url of each image
df = df.select(df["path"].alias("s3_url"))

# Download images and load as a PIL Image object
df = df.with_column(
    "image",
    df["s3_url"].url.download().apply(
        lambda data: Image.open(io.BytesIO(data)),
        return_dtype=daft.DataType.python(),
    ),
)

# Generate thumbnails from images
df = df.with_column("thumbnail", df["image"].apply(get_thumbnail, return_dtype=daft.DataType.python()))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmarks for TPC-H at scale factor 100 (SF100)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
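For example, the variable can be set for a whole shell session or for a single invocation (the script name below is hypothetical):

```shell
# Disable Daft analytics for the current shell session
export DAFT_ANALYTICS_ENABLED=0

# Or disable it for a single run only (script name is illustrative)
DAFT_ANALYTICS_ENABLED=0 python my_pipeline.py
```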

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.1.7.tar.gz (584.3 kB)

Uploaded Source

Built Distributions

getdaft-0.1.7-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (9.4 MB)

Uploaded CPython 3.7+ manylinux: glibc 2.17+ x86-64

getdaft-0.1.7-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (8.7 MB)

Uploaded CPython 3.7+ manylinux: glibc 2.17+ ARM64

getdaft-0.1.7-cp37-abi3-macosx_11_0_arm64.whl (7.9 MB)

Uploaded CPython 3.7+ macOS 11.0+ ARM64

getdaft-0.1.7-cp37-abi3-macosx_10_7_x86_64.whl (8.8 MB)

Uploaded CPython 3.7+ macOS 10.7+ x86-64

File details

Details for the file getdaft-0.1.7.tar.gz.

File metadata

  • Download URL: getdaft-0.1.7.tar.gz
  • Upload date:
  • Size: 584.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.16

File hashes

Hashes for getdaft-0.1.7.tar.gz
  • SHA256: 3b8655253ff722efda3b411244b67b6285395a41114853db43efaf37fd0dc30a
  • MD5: 0cae1c3eae6a7b8cfa42c84c42080ce9
  • BLAKE2b-256: f28b817f71ad19fc945c35f55bffc8558ac48b73b5d5cd91fb1b32834949d8cc

See more details on using hashes here.
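For illustration, a downloaded artifact's SHA256 digest can be verified against the published value using only the Python standard library; the filename and expected digest in the comment below refer to the source distribution on this page:

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 1 << 16) -> str:
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest published above, e.g. for getdaft-0.1.7.tar.gz:
# expected = "3b8655253ff722efda3b411244b67b6285395a41114853db43efaf37fd0dc30a"
# assert sha256_of_file("getdaft-0.1.7.tar.gz") == expected
```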

File details

Details for the file getdaft-0.1.7-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.7-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  • SHA256: 61e334540caf6866835335d4cc5fec5a2cb0a6898b19ca87b528d6ab662af1cb
  • MD5: b078803b72194c84f2b71cd7ab3f61d2
  • BLAKE2b-256: 1d4a318817c2e6b313384d0ceb299af6047ec15085a8c68bf292ba6969e96eed


File details

Details for the file getdaft-0.1.7-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.7-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
  • SHA256: dac1691ad92e3649b64d70d7c3d5b5f334fea29fc944942e31f9af69917b9e34
  • MD5: 1da6c504da40337e6c73df67e5ce8301
  • BLAKE2b-256: cad0b8aec33adadb027ed928713ccb96e05446b38caaf54a22f5a4ddfbad131d


File details

Details for the file getdaft-0.1.7-cp37-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.7-cp37-abi3-macosx_11_0_arm64.whl
  • SHA256: 1c55a8f82bcf7fafe3fa9526487f04c56bb8eb29799aace57c975c3584638a1c
  • MD5: 19575da30e7a8b4243fe56f0fbf807ad
  • BLAKE2b-256: 13d8ce701e171df91e64f6a8d3a07f38b4c4362fcc38684be2f5edd56efc94ee


File details

Details for the file getdaft-0.1.7-cp37-abi3-macosx_10_7_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.7-cp37-abi3-macosx_10_7_x86_64.whl
  • SHA256: 927d485b6ae491c8199c150f584810ddfee4b8318bfc3f87a163fc3f9f84b05a
  • MD5: 81c7a549b1ab95469e8d2a97a17c2fe7
  • BLAKE2b-256: 5e1adf25d8d412be4ad8ebf11a909d3555c9a22b62193ad5801fd85a3febaa22

