Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, Protobufs, CSV files, Parquet files, and audio files, into a table dataframe structure for easy querying.

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage
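
To make "lazy" and the optimizer concrete, here is a minimal sketch using a small in-memory dataframe; each call below only builds a query plan, and nothing runs until the plan is explicitly executed:

import daft

df = daft.from_pydict({"a": [1, 2, 3, 4], "b": ["w", "x", "y", "z"]})

# Lazy: these calls construct a query plan rather than computing results
df = df.where(df["a"] > 2).select("a")

df.explain()   # inspect the plan the optimizer will run
df.collect()   # execute the plan and materialize the result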

About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently, thanks to its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for an interactive developer experience in notebooks or REPLs - intelligent caching and query optimization accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop's computational resources - Daft integrates natively with Ray to run dataframes on large clusters of machines with thousands of CPUs/GPUs (see the sketch below).
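
For example, moving from the default local runner to a Ray cluster is a one-line change made before any dataframes are created; a minimal sketch, where the cluster address is a placeholder:

import daft

# Route subsequent dataframe execution to Ray; with no address, Daft
# starts or connects to a local Ray instance
daft.context.set_runner_ray()  # e.g. set_runner_ray(address="ray://<head-node>:10001")

df = daft.from_pydict({"x": [1, 2, 3]})
df.collect()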

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
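
For example, optional dependencies are installed as pip extras, e.g. pip install "getdaft[all]"; the exact extra names available (such as ray or aws) are listed in the Installation Guide.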

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

(Screenshot: dataframe loading a folder of images from AWS S3 and creating thumbnails)

Benchmarks

(Chart: benchmark results for TPC-H at scale factor SF100)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft's full range of capabilities, including data loading from URLs, joins, user-defined functions (UDFs), groupbys, aggregations, and more (a UDF sketch follows this list).

  • User Guide - take a deep dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft
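
As a taste of the UDF workflow covered in the tour, here is a minimal sketch of a stateless Python UDF; the column names and the add_one helper are illustrative, not part of Daft:

import daft

# A UDF declares the dtype of the column it produces; Daft passes the
# input column in as a daft Series
@daft.udf(return_dtype=daft.DataType.int64())
def add_one(col):
    return [x + 1 for x in col.to_pylist()]

df = daft.from_pydict({"a": [1, 2, 3]})
df = df.with_column("a_plus_one", add_one(df["a"]))
df.collect()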

Contributing

To start contributing to Daft, please read CONTRIBUTING.md

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
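
Since the telemetry session ID is generated when Daft is imported, the variable must be set before the import; a minimal sketch:

import os

# Opt out of Daft analytics; this must run before `import daft`,
# because the telemetry session is created at import time
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft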

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Download files

Download the file for your platform.

Source Distribution

  File                                                                        Size     Tags
  getdaft-0.2.19.tar.gz                                                       1.3 MB   Source

Built Distributions

  File                                                                        Size     Tags
  getdaft-0.2.19-cp37-abi3-win_amd64.whl                                      17.9 MB  CPython 3.7+, Windows x86-64
  getdaft-0.2.19-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl     22.5 MB  CPython 3.7+, manylinux: glibc 2.17+ x86-64
  getdaft-0.2.19-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl   21.0 MB  CPython 3.7+, manylinux: glibc 2.17+ ARM64
  getdaft-0.2.19-cp37-abi3-macosx_11_0_arm64.whl                              17.4 MB  CPython 3.7+, macOS 11.0+ ARM64
  getdaft-0.2.19-cp37-abi3-macosx_10_7_x86_64.whl                             18.8 MB  CPython 3.7+, macOS 10.7+ x86-64

File details

Per-file metadata and hashes for this release.

getdaft-0.2.19.tar.gz
  Size: 1.3 MB | Tags: Source | Uploaded via: twine/5.0.0 CPython/3.8.18 | Trusted Publishing: No
  SHA256       d8a0f050257eb910358682cf4565309a38be762f928a4ce99375b6bd7c06cadf
  MD5          a5191d2d47e2f0bcd8620776d2d1cee1
  BLAKE2b-256  ed987c31e85853539880bac52b6c99dbf6a0f2a47b9767551035edda77644a86

getdaft-0.2.19-cp37-abi3-win_amd64.whl
  Size: 17.9 MB | Tags: CPython 3.7+, Windows x86-64 | Uploaded via: twine/5.0.0 CPython/3.8.18 | Trusted Publishing: No
  SHA256       95a34fb500646f1a77933fd8e9aa690effed21bb059fc40a8ef4a95785fa3cb8
  MD5          017c1be77bbf391099abe4ffac5396de
  BLAKE2b-256  8ee8fff0c10ebdc8bc9c4f7eea1dd534756e6fd835016fb771291fc743b23073

getdaft-0.2.19-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256       857c4ae865026d87ca4ead652d71b3eb70062ce4d5a17c30a55afd720024360f
  MD5          558f894f3ad2fa7a136d1e43ee6368c7
  BLAKE2b-256  6e8a77af4bcbed5e79bd4e89ed94617ff62ee4d83cc0e9f26029e658b122347f

getdaft-0.2.19-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
  SHA256       6913e36f1acaa31fbde172215e02119eb2a5bf845c88e29f4024e4c1070d6759
  MD5          40508db067d8470f8f6c0da44119e992
  BLAKE2b-256  b29e375e56377a97d204f5135383195a1d1e17c16eec95562d7e63498ecf5e87

getdaft-0.2.19-cp37-abi3-macosx_11_0_arm64.whl
  SHA256       ac0328427b561c211465b38904a45c1e25d6fd3ac3e210abed6e5e5effcc76ed
  MD5          01c8695e208c37c8e69a141342a4a727
  BLAKE2b-256  ffec3b553695ab97422b31617bc91691130613bc01e49da7784c0c225a2849fb

getdaft-0.2.19-cp37-abi3-macosx_10_7_x86_64.whl
  SHA256       34dbe4153ca8b0d8d9dce863ee025e492c448b60c162f2f7f6efbe753094156d
  MD5          9ee71ce0d2c8fe52a0834f28c7dd4b93
  BLAKE2b-256  f12683b82f13e023288dd47d22725670bd8ac18ccc2278f6e0071e69654a43c1
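
To check a downloaded file against the published digests, a standard-library sketch (filename and expected SHA256 taken from the source distribution above):

import hashlib

# Stream the file and compare its SHA256 digest to the published value
sha256 = hashlib.sha256()
with open("getdaft-0.2.19.tar.gz", "rb") as f:
    for chunk in iter(lambda: f.read(8192), b""):
        sha256.update(chunk)

assert sha256.hexdigest() == "d8a0f050257eb910358682cf4565309a38be762f928a4ce99375b6bd7c06cadf"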
