A Distributed DataFrame library for large scale complex data processing.

Project description

Daft dataframes can load any data, such as PDF documents, images, Protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying.

Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: the distributed Python dataframe for complex data

Daft is a fast, Pythonic and scalable open-source dataframe library built for Python and Machine Learning workloads.

Daft is currently in its Beta release phase - please expect bugs and rapid improvements to the project. We welcome user feedback and feature requests in our Discussions forum.

Table of Contents

  • About Daft
  • Getting Started
  • Benchmarks
  • More Resources
  • Contributing
  • Telemetry
  • License

About Daft

The Daft dataframe is a table of data with rows and columns. Columns can contain any Python objects, which allows Daft to support rich complex data types such as images, audio, video and more.

  1. Any Data: Columns can contain any Python objects, which means that the Python libraries you already use for running machine learning or custom data processing will work natively with Daft! (See the short sketch after this list.)

  2. Notebook Computing: Daft is built for the interactive developer experience in a notebook - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Rich complex formats such as images can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.
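As a minimal sketch of point 1, the snippet below stores instances of a hypothetical Python class directly in a column and runs plain Python code over it. It assumes daft.from_pydict is available in your version of Daft; the class and column names are purely illustrative.

import daft

class BoundingBox:
    """A hypothetical user-defined class, stored as-is in a Daft column."""
    def __init__(self, width: int, height: int):
        self.width = width
        self.height = height

# Build a dataframe whose "box" column holds arbitrary Python objects
df = daft.from_pydict({
    "filename": ["a.png", "b.png"],
    "box": [BoundingBox(32, 32), BoundingBox(16, 8)],
})

# Ordinary Python logic runs directly over the Python-object column
df = df.with_column(
    "area",
    df["box"].apply(lambda b: b.width * b.height, return_dtype=daft.DataType.python()),
)

df.show(2)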

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide
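For example, the AWS and Ray extras can typically be installed with pip install "getdaft[aws]" or pip install "getdaft[ray]"; the exact extra names here are an assumption, so treat the Installation Guide as authoritative.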

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from an AWS S3 bucket and run a simple function to generate thumbnails for each image:

import daft

import io
from PIL import Image

def get_thumbnail(img: Image.Image) -> Image.Image:
    """Simple function to make an image thumbnail"""
    imgcopy = img.copy()
    imgcopy.thumbnail((48, 48))
    return imgcopy

# Load a dataframe from files in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# Get the AWS S3 url of each image
df = df.select(df["path"].alias("s3_url"))

# Download images and load as a PIL Image object
df = df.with_column("image", df["s3_url"].url.download().apply(lambda data: Image.open(io.BytesIO(data)), return_dtype=daft.DataType.python()))

# Generate thumbnails from images
df = df.with_column("thumbnail", df["image"].apply(get_thumbnail, return_dtype=daft.DataType.python()))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmarks for SF100 TPC-H

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
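For example, a minimal way to opt out from within Python is sketched below; it assumes the variable is set before daft is imported:

import os

# Opt out of Daft's anonymous telemetry before importing daft
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft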

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details


Release history

This version: 0.1.5

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.1.5.tar.gz (557.3 kB)

Uploaded Source

Built Distributions

getdaft-0.1.5-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.7 MB)

Uploaded CPython 3.7+ manylinux: glibc 2.17+ x86-64

getdaft-0.1.5-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (4.0 MB)

Uploaded CPython 3.7+ manylinux: glibc 2.17+ ARM64

getdaft-0.1.5-cp37-abi3-macosx_11_0_arm64.whl (3.7 MB)

Uploaded CPython 3.7+ macOS 11.0+ ARM64

getdaft-0.1.5-cp37-abi3-macosx_10_7_x86_64.whl (4.5 MB)

Uploaded CPython 3.7+ macOS 10.7+ x86-64

File details

Details for the file getdaft-0.1.5.tar.gz.

File metadata

  • Download URL: getdaft-0.1.5.tar.gz
  • Upload date:
  • Size: 557.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.16

File hashes

Hashes for getdaft-0.1.5.tar.gz

  • SHA256: 78ef27b62c1d82fe3170f059265122373f169ba320d950163f890840a331a8cb
  • MD5: e5348f0d2b241f830ee82f32d86cc4ba
  • BLAKE2b-256: 90ee4f25e8f0a8ab46f491ad74854cf30d421a64759e0fe50438e227c08dd909

See more details on using hashes here.
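As a rough illustration of checking one of these digests locally, here is a minimal sketch using Python's standard hashlib; it assumes getdaft-0.1.5.tar.gz has already been downloaded to the working directory:

import hashlib

# Compare the downloaded file against the SHA256 digest listed above
expected = "78ef27b62c1d82fe3170f059265122373f169ba320d950163f890840a331a8cb"

digest = hashlib.sha256()
with open("getdaft-0.1.5.tar.gz", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        digest.update(chunk)

print("OK" if digest.hexdigest() == expected else "Hash mismatch!")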

File details

Details for the file getdaft-0.1.5-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.5-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

  • SHA256: 3ce038f5d8685e92369ca2afe0f64753d01faffefa2aa63e5e37b00822c70adf
  • MD5: 493df322b5b6fc008f26c285d1d35ec5
  • BLAKE2b-256: 79168d7d53201edecbc45ef0b316a92e2bacb5bc5cb97dfc98b86dfa9d8a9145

See more details on using hashes here.

File details

Details for the file getdaft-0.1.5-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.5-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl

  • SHA256: eee2ee32336dc0e9e2cdb07d4b1f23071817c80dbfe8e4c16f920d8445bac976
  • MD5: f4d58b4845ba10fa867e06047c473059
  • BLAKE2b-256: 7b9fb49926893245c892f2b1afb5c759f8481f1d54c62a798ee4f6352638fe73

See more details on using hashes here.

File details

Details for the file getdaft-0.1.5-cp37-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.5-cp37-abi3-macosx_11_0_arm64.whl

  • SHA256: 765fc39410e9c6574e4c6c3c6b0a648c6e3b8577044cf02e8bf188f97d60cdea
  • MD5: dea72145913bbeded470a28fb1e9591b
  • BLAKE2b-256: 0edd028d9be60d3a6c1450306b0801eb149dcfcc441b0b59b854db119f328c08

See more details on using hashes here.

File details

Details for the file getdaft-0.1.5-cp37-abi3-macosx_10_7_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.5-cp37-abi3-macosx_10_7_x86_64.whl

  • SHA256: 8205b97fe27efaca64599d522a4c33eb75d966137ffa53a8253d9b3f9268eab9
  • MD5: d457d30f5d700217478ac3e1be19698a
  • BLAKE2b-256: 1938f408f20718b48d42276f137526f0537cb3eea72a4cc8b7705308dce03e9c

See more details on using hashes here.
