Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying.

Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration (see the sketch after this list)

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage
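
To make the lazy API and Arrow interchange concrete, here is a minimal sketch using small in-memory data (the calls below reflect the daft API as of this release, but treat them as illustrative and check the Docs for your version):

import daft

# Building a dataframe and chaining transformations only constructs a
# logical plan - nothing executes yet
df = daft.from_pydict({"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]})
df = df.where(df["a"] > 1).with_column("a_plus_b", df["a"] + df["b"])

# Inspect the plan that the query optimizer will rewrite before running
df.explain()

# Execution is triggered by collect()/show(); results are Arrow-backed
df = df.collect()
arrow_table = df.to_arrow()  # hand off to the Apache Arrow ecosystem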

About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings, and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop's computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs (see the sketch below).
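
For example, switching from the default local runner to a Ray cluster is intended to be a one-line change. A minimal sketch (the cluster address is a placeholder, not a real endpoint):

import daft

# Select the Ray runner before running any dataframe operations;
# without this call Daft uses its local runner
daft.context.set_runner_ray(address="ray://<head-node>:10001")  # placeholder

df = daft.from_pydict({"x": list(range(1_000_000))})
print(df.count_rows())  # executed by Ray workers across the cluster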

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmarks for SF100 TPC-H

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft's full range of capabilities including dataloading from URLs, joins, user-defined functions (UDFs), groupby, aggregations, and more (see the UDF sketch after this list).

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft
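
As a taste of the UDFs mentioned in the tour, here is a minimal sketch (the @daft.udf decorator with a return_dtype is the documented pattern; add_one is a made-up example function):

import daft

# A Python UDF receives a daft.Series per input column and must
# return one value per row
@daft.udf(return_dtype=daft.DataType.int64())
def add_one(col):
    return [v + 1 for v in col.to_pylist()]

df = daft.from_pydict({"x": [1, 2, 3]})
df.with_column("x_plus_one", add_one(df["x"])).show()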

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
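
Because the session ID is generated when Daft is imported (see below), the variable must be set before the import. For example, from Python:

import os

# Opt out of telemetry; this must happen before daft is imported
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft  # this session will not emit analytics events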

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.2.20.tar.gz (1.4 MB): Source

Built Distributions

  • getdaft-0.2.20-cp37-abi3-win_amd64.whl (18.1 MB): CPython 3.7+, Windows x86-64
  • getdaft-0.2.20-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (22.6 MB): CPython 3.7+, manylinux (glibc 2.17+), x86-64
  • getdaft-0.2.20-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (21.2 MB): CPython 3.7+, manylinux (glibc 2.17+), ARM64
  • getdaft-0.2.20-cp37-abi3-macosx_11_0_arm64.whl (17.5 MB): CPython 3.7+, macOS 11.0+, ARM64
  • getdaft-0.2.20-cp37-abi3-macosx_10_7_x86_64.whl (19.0 MB): CPython 3.7+, macOS 10.7+, x86-64

File details

Details for the file getdaft-0.2.20.tar.gz.

File metadata

  • Download URL: getdaft-0.2.20.tar.gz
  • Size: 1.4 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.20.tar.gz:

  • SHA256: 681f8a11ee4a286ba8de18b18de290f20f987303a87d2031cfcb5636c186d55d
  • MD5: 13e122abfde420a89f22dbce6a982d13
  • BLAKE2b-256: 3eddf4d0e1cc0cc327fdc9370b10c8fc4021719e33c5ed5cee0a35e4ac7229fc

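For example, to check a downloaded file against the published SHA256 digest with Python's standard library (the local filename assumes the file sits in the current directory):

import hashlib

expected = "681f8a11ee4a286ba8de18b18de290f20f987303a87d2031cfcb5636c186d55d"

h = hashlib.sha256()
with open("getdaft-0.2.20.tar.gz", "rb") as f:  # assumed local path
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)

assert h.hexdigest() == expected, "SHA256 mismatch - do not install"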

File details

Details for the file getdaft-0.2.20-cp37-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.20-cp37-abi3-win_amd64.whl
  • Size: 18.1 MB
  • Tags: CPython 3.7+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.20-cp37-abi3-win_amd64.whl:

  • SHA256: 0b3c166d6f8ef59fe2c79e02204ee1d5e66cf6ebcc2068c2519915e3b15c598f
  • MD5: 8ac1a9e910a97fad7afbf7d24aab435c
  • BLAKE2b-256: 308ad30a998ef9ba23850a60cef8f6c1418eaa5ed0cf7cd85eae59ffb64506da


File details

Details for the file getdaft-0.2.20-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File hashes

Hashes for getdaft-0.2.20-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

  • SHA256: 89c11a93a08b3277e3c29ad9e3f2121757f6dfe52dc3f46fefdea4f6a5731c72
  • MD5: aabeb7d88e201bd09910db658d373938
  • BLAKE2b-256: 20c6478d36494d4914c819f864605070a7a33822a9474cef284bfc77371c1018


File details

Details for the file getdaft-0.2.20-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File hashes

Hashes for getdaft-0.2.20-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl:

  • SHA256: 5d0ab180e8c7dc1111001fba7fbff8cc75009b7411ce310da971f94b63ceeb6a
  • MD5: 430bf9f8e1ae7fac5ff7f24ebced6a19
  • BLAKE2b-256: 91ef7cc191b8f4a1b8a135d81e911934abac240282ad1dc22a2a144ec082fd97


File details

Details for the file getdaft-0.2.20-cp37-abi3-macosx_11_0_arm64.whl.

File hashes

Hashes for getdaft-0.2.20-cp37-abi3-macosx_11_0_arm64.whl:

  • SHA256: ccd0237c49ecbd549e0bad57bf7de97e01dd475e3c3c1a2978eefcc4ff0a6072
  • MD5: c1e12be383f79fcdfe31d0cbe2d63d80
  • BLAKE2b-256: 285abfa6c621aeb526c7f56bf84bc08c09fea98d445fb9454d27123cb7c42b5d


File details

Details for the file getdaft-0.2.20-cp37-abi3-macosx_10_7_x86_64.whl.

File hashes

Hashes for getdaft-0.2.20-cp37-abi3-macosx_10_7_x86_64.whl:

  • SHA256: 93e621af5be011f58fb9a9ec872c88c8526abb80b8a4910a7434dac92a4f0c58
  • MD5: b37c3c87d76a25faa0f5188ad3988530
  • BLAKE2b-256: 8c535a3a670ee8ff89a4d53a1d5e723d5d4c4387563a2efe34501f23ceab1f45

