Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, Protobufs, CSV, Parquet, and audio files, into a tabular dataframe structure for easy querying

Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python, implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format (see the interchange sketch after this list)

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage
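As a minimal illustration of the Arrow-based interchange mentioned above, the sketch below round-trips a table between PyArrow and Daft. It assumes only that pyarrow is installed alongside Daft; the column names are made up for the example.

import daft
import pyarrow as pa

# Build a plain Arrow table, hand it to Daft, and convert it back.
# Because Daft is Arrow-backed, these conversions avoid reserializing the data.
table = pa.table({"id": [1, 2, 3], "score": [0.5, 0.9, 0.1]})

df = daft.from_arrow(table)  # Arrow -> Daft dataframe
roundtrip = df.to_arrow()    # Daft dataframe -> Arrow table

assert roundtrip.num_rows == table.num_rows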

About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs, with intelligent caching and query optimizations that accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop's computational resources. Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs (see the sketch after this list).
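A minimal sketch of moving dataframe code onto a Ray cluster follows; the cluster address is a placeholder, and calling set_runner_ray() with no address starts a local Ray instance instead.

import daft

# Assumption: a Ray cluster is reachable at this placeholder address.
# Omit the address to have Daft start a local Ray instance.
daft.context.set_runner_ray(address="ray://my-cluster:10001")

# The dataframe API is unchanged; work is now scheduled on Ray.
df = daft.from_pydict({"x": [1, 2, 3]})
df = df.with_column("x_doubled", df["x"] * 2)
df.show()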

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source, or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

(Screenshot: the dataframe above, showing images loaded from AWS S3 alongside their 32x32 thumbnails.)

Benchmarks

(Benchmark chart: TPC-H results at scale factor 100.)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft's full range of capabilities including dataloading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more (a UDF sketch follows this list).

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft
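As a taste of the user-defined functions mentioned in the tour above, here is a minimal sketch of a Daft UDF. The text_length name and the toy column are illustrative, not part of Daft's API.

import daft

# A Daft UDF receives daft.Series arguments and returns a list (or
# Series) of the same length; return_dtype declares the output type.
@daft.udf(return_dtype=daft.DataType.int64())
def text_length(texts):
    return [len(t) for t in texts.to_pylist()]

df = daft.from_pydict({"text": ["hello", "daft"]})
df = df.with_column("length", text_length(df["text"]))
df.show()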

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
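If you prefer to set this from Python rather than the shell, a minimal sketch is below. Note that the variable must be set before Daft is first imported, since the session ID is generated at import time.

import os

# Must run before the first import of daft, because the telemetry
# session ID is generated when Daft is imported.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft  # telemetry is disabled for this session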

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license; please see the LICENSE file.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.2.18.tar.gz (1.3 MB): Source

Built Distributions

  • getdaft-0.2.18-cp37-abi3-win_amd64.whl (17.9 MB): CPython 3.7+, Windows x86-64
  • getdaft-0.2.18-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (22.4 MB): CPython 3.7+, manylinux (glibc 2.17+), x86-64
  • getdaft-0.2.18-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (21.0 MB): CPython 3.7+, manylinux (glibc 2.17+), ARM64
  • getdaft-0.2.18-cp37-abi3-macosx_11_0_arm64.whl (17.3 MB): CPython 3.7+, macOS 11.0+, ARM64
  • getdaft-0.2.18-cp37-abi3-macosx_10_7_x86_64.whl (18.8 MB): CPython 3.7+, macOS 10.7+, x86-64

File details

Details for the file getdaft-0.2.18.tar.gz.

File metadata

  • Download URL: getdaft-0.2.18.tar.gz
  • Size: 1.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.18.tar.gz:

  • SHA256: 050bff56b2de21b54c3207862d97f474632350f3e5edc72e378ed9004c9b7896
  • MD5: 9a083aa7271ce369952939cfeb20cdb5
  • BLAKE2b-256: 1eee691d58a7aac58258a73879427583eaffcf2ca6aa51358c7e586f8a858588

File details

Details for the file getdaft-0.2.18-cp37-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.18-cp37-abi3-win_amd64.whl
  • Size: 17.9 MB
  • Tags: CPython 3.7+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.18-cp37-abi3-win_amd64.whl:

  • SHA256: 181ae2886e0d9f1fee945bd613cd381cb43e8502ecd0a5128691f63772f595ad
  • MD5: 5d9df69262b23b30a3fba730792d33c0
  • BLAKE2b-256: 4c736e893027f10d55e2c86ef501a9eda5c6308b9ceed98559a66f707fa6b130

File details

Details for the file getdaft-0.2.18-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File hashes

Hashes for getdaft-0.2.18-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

  • SHA256: 68521d84d1e32606e3266a5a6723cefb530563d27437edf1fbc15a86e64298e1
  • MD5: 08e59455061132b7b4ebfe07f91d07b4
  • BLAKE2b-256: baa0cbe435f01b4b327bd327314c49bcad9aabdadb19672525a89fff7a3ecced

File details

Details for the file getdaft-0.2.18-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File hashes

Hashes for getdaft-0.2.18-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl:

  • SHA256: 3096ecf9c06151ee61a97c6e5034dade6a28ce6e419b0ba7f2c8115c725cfb0f
  • MD5: bb167f8b9a576ac4c076750b96402f45
  • BLAKE2b-256: 59f741dc796c80728c6bcdd243b29d69ca02c7c8cfffbb04277ba9ef82ff2bc6

File details

Details for the file getdaft-0.2.18-cp37-abi3-macosx_11_0_arm64.whl.

File hashes

Hashes for getdaft-0.2.18-cp37-abi3-macosx_11_0_arm64.whl:

  • SHA256: ec20083257cf20e7f0b255f045f449a803aeee850a04eedd51148d8cfbfc9fbf
  • MD5: a3500b2dda62f8b059a1d8ea38b272aa
  • BLAKE2b-256: ec33dcc6b2b964a53f38101bc6e5bb288881d6b0082a33510fa445fa31754a98

File details

Details for the file getdaft-0.2.18-cp37-abi3-macosx_10_7_x86_64.whl.

File hashes

Hashes for getdaft-0.2.18-cp37-abi3-macosx_10_7_x86_64.whl:

  • SHA256: 546a9b78919510889a895835dfb646d56d967b66c55873fe00867548392629d2
  • MD5: ed2e82ee7b7112255471c58caa642413
  • BLAKE2b-256: 95f1922522d7ac56e0ea3b0a476f9b708456c6f607c3f1a44789c884df4f4385
