
A Distributed DataFrame library for large scale complex data processing.

Project description

Daft dataframes can load any data - such as PDF documents, images, protobufs, CSV, Parquet, and audio files - into a tabular dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: the distributed Python dataframe for complex data

Daft is a fast, Pythonic and scalable open-source dataframe library built for Python and Machine Learning workloads.

Daft is currently in its Alpha release phase - expect bugs and rapid improvements to the project. We welcome user feedback and feature requests in our Discussions forums.


About Daft

The Daft dataframe is a table of data with rows and columns. Columns can contain any Python objects, which allows Daft to support rich complex data types such as images, audio, video and more.

  1. Any Data: Columns can contain any Python objects, which means that the Python libraries you already use for running machine learning or custom data processing will work natively with Daft!

  2. Notebook Computing: Daft is built for the interactive developer experience in a notebook - intelligent caching and query optimization accelerate your experimentation and data exploration.

  3. Distributed Computing: Rich complex formats such as images can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

Quickstart

Check out our full quickstart tutorial!

In this example, we load images from an AWS S3 bucket and run a simple function to generate thumbnails for each image:

from daft import DataFrame, lit

import io
from PIL import Image

def get_thumbnail(img: Image.Image) -> Image.Image:
    """Simple function to make an image thumbnail"""
    imgcopy = img.copy()
    imgcopy.thumbnail((48, 48))
    return imgcopy

# Load a dataframe from files in an S3 bucket
df = DataFrame.from_files("s3://daft-public-data/laion-sample-images/*")

# Get the AWS S3 url of each image
df = df.select(lit("s3://").str.concat(df["name"]).alias("s3_url"))

# Download images and load as a PIL Image object
df = df.with_column("image", df["s3_url"].url.download().apply(lambda data: Image.open(io.BytesIO(data))))

# Generate thumbnails from images
df = df.with_column("thumbnail", df["image"].apply(get_thumbnail))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails
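PIL's Image.thumbnail, used in get_thumbnail above, shrinks an image in place so it fits within the given bounding box while preserving its aspect ratio. The size computation it performs can be approximated in plain Python (fit_within is an illustrative helper, not part of Daft or PIL):

```python
def fit_within(width: int, height: int, max_w: int, max_h: int) -> tuple[int, int]:
    """Scale (width, height) down to fit inside (max_w, max_h), keeping aspect ratio."""
    # Use the smaller scale factor so both dimensions fit; never upscale past 1.0.
    scale = min(max_w / width, max_h / height, 1.0)
    return max(1, round(width * scale)), max(1, round(height * scale))

# A 640x480 image thumbnailed to a 48x48 box keeps its 4:3 shape.
print(fit_within(640, 480, 48, 48))  # (48, 36)
```

This is why thumbnails of non-square images come back smaller than 48x48 in one dimension rather than distorted.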

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.0.21.tar.gz (625.0 kB) - Source

Built Distributions

  • getdaft-0.0.21-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.5 MB) - CPython 3.7+, manylinux: glibc 2.17+, x86-64
  • getdaft-0.0.21-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (1.5 MB) - CPython 3.7+, manylinux: glibc 2.17+, ARM64
  • getdaft-0.0.21-cp37-abi3-macosx_11_0_arm64.whl (589.8 kB) - CPython 3.7+, macOS 11.0+, ARM64
  • getdaft-0.0.21-cp37-abi3-macosx_10_7_x86_64.whl (610.8 kB) - CPython 3.7+, macOS 10.7+, x86-64

File details

Details for the file getdaft-0.0.21.tar.gz.

File metadata

  • Download URL: getdaft-0.0.21.tar.gz
  • Size: 625.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.15

File hashes

  • SHA256: 6260db85e4af4963745cd1e18f8b4c2710f79dcf6293d83bf15392e82f6711e3
  • MD5: d69e55bb9456e252c55ddae9673a9206
  • BLAKE2b-256: e67fc54a3f632f95cd8877cd00cc531213dd7961ec0d1a25d5b146cac0a138d9

See more details on using hashes here.
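A downloaded file's integrity can be checked against the published digest with Python's standard hashlib module. A minimal sketch (the byte string stands in for the downloaded file's contents; its digest is the well-known SHA256 of "hello"):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the lowercase hex SHA256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

# Compare the computed digest of the downloaded bytes to the published one.
published = "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"
assert sha256_of(b"hello") == published
print("digest matches")
```

For a real release file, read the file in binary mode and compare its digest to the SHA256 value listed above.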

File details

Details for the file getdaft-0.0.21-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File hashes

  • SHA256: d879a18295a0e7e795c221e55931d89a199ca31e1d989faf03ff906cba3652f1
  • MD5: 7ac2b406d3312625ff6b254f96f81cd9
  • BLAKE2b-256: 113706076266886cf9933649da8553e93840422cc29233c342fa61dcb9eb1e47

File details

Details for the file getdaft-0.0.21-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File hashes

  • SHA256: afe9795ece7435afbb757c1201cd1c8a45a66a68450ec06d601a287ddc6b25a4
  • MD5: 0369c4356591135d27660c8f9311c2f0
  • BLAKE2b-256: 3572e0f3a430761fc76c7487633d41a05ff47c4a6ab8f1dd5505f3e55f25b41d

File details

Details for the file getdaft-0.0.21-cp37-abi3-macosx_11_0_arm64.whl.

File hashes

  • SHA256: 5fa830a8434b8770bfd2b5fdabc3092516a899103ec8e3609300e402bb024440
  • MD5: cc7e603016985c642f7fc5079e14b9a3
  • BLAKE2b-256: 96c2a19bc06c70569eb47c1f9b32ffd4a7e20fdbca75c67951e9fea8eff5af5f

File details

Details for the file getdaft-0.0.21-cp37-abi3-macosx_10_7_x86_64.whl.

File hashes

  • SHA256: e3de38009e0f9fa803bfe4d535c506169bb882f65fd78505a2a9b1fa6c8beda5
  • MD5: f2055c7a4e2b81d2d9cd822e53fa6ce9
  • BLAKE2b-256: acfe1e67611263f63417000c9d7d4d21d46e526bd163e728cfce87a1475e228c
