
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, Protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying


Website | Docs | Installation | 10-minute tour of Daft | Community and Support


Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently, thanks to its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
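For example, using the extras published for the getdaft package on PyPI:

```shell
# Daft with the Ray runner and AWS (S3) utilities
pip install "getdaft[ray,aws]"

# Everything, including all optional integrations
pip install "getdaft[all]"
```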

Quickstart

Check out our 10-minute quickstart!

In this example, we load image files from an AWS S3 bucket via their URLs and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

(Image: dataframe code loading a folder of images from AWS S3 and creating thumbnails)

Benchmarks

(Benchmark chart: SF100 TPC-H results)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
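Because the session ID is generated when Daft is imported, the variable must be set before the import; for example, from Python:

```python
import os

# The session ID is generated at import time, so set this before importing Daft
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

# import daft  # telemetry is now disabled for this session
```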

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.2.28.tar.gz (3.2 MB)

Uploaded Source

Built Distributions

getdaft-0.2.28-cp38-abi3-win_amd64.whl (19.3 MB)

Uploaded CPython 3.8+ Windows x86-64

getdaft-0.2.28-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (21.8 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ x86-64

getdaft-0.2.28-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (20.8 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ ARM64

getdaft-0.2.28-cp38-abi3-macosx_11_0_arm64.whl (17.5 MB)

Uploaded CPython 3.8+ macOS 11.0+ ARM64

getdaft-0.2.28-cp38-abi3-macosx_10_12_x86_64.whl (19.0 MB)

Uploaded CPython 3.8+ macOS 10.12+ x86-64

File details

Details for the file getdaft-0.2.28.tar.gz.

File metadata

  • Download URL: getdaft-0.2.28.tar.gz
  • Upload date:
  • Size: 3.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.28.tar.gz
Algorithm Hash digest
SHA256 1389ef47caa61f0daf3217b4bd5042b50e854bfb1315b104341110c09a6c072f
MD5 3d9fe4f729cec2ba0f57b705d81ee16c
BLAKE2b-256 90967e7027b92759863add7aeb5789b300551cb0a4e39f640c1c556f5d6c633f

See more details on using hashes here.
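Downloaded files can be checked against the digests above using only the standard library; a minimal sketch (the helper name is ours):

```python
import hashlib

def verify_sha256(path: str, expected_hex: str) -> bool:
    """Stream a file through SHA-256 and compare to a published digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex

# e.g., for the sdist listed above:
# verify_sha256("getdaft-0.2.28.tar.gz",
#               "1389ef47caa61f0daf3217b4bd5042b50e854bfb1315b104341110c09a6c072f")
```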

File details

Details for the file getdaft-0.2.28-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.28-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 19.3 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.28-cp38-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 679a9d26f76f695f4fa3c51c732c02f511eeb5a832b305bbd237c2e62333f815
MD5 3d05a11775eb4116e7d43ae72614d138
BLAKE2b-256 cca78c9911df1cb8d8b0da497e37e859d7f31f4c345541bd3aa7cc8a77190b3a


File details

Details for the file getdaft-0.2.28-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.28-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 da7be6b900798061090ea99f474ad1d128fb493f958c17854248eacfad68a969
MD5 10aecc12fb100d03f0c2709294f34440
BLAKE2b-256 5c30e4d56a7812525dd67270394cc0b09ea1d3a2af0661358acaa905bb6b260c


File details

Details for the file getdaft-0.2.28-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.28-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 477d77f695129843d1bdfe3896d17cd5af43024e06c1956077f6afe2069e4dcf
MD5 e3f673f6d3d5b42e1e586419f2c2f488
BLAKE2b-256 d635d5dacbe4247e15d88a0eb3c16c10767ae4b59be8d669a39945fed2146b2c


File details

Details for the file getdaft-0.2.28-cp38-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.28-cp38-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 6d120504f05dadac6fa0c170558f2635e5654d1e49ffcd95c20952847427e069
MD5 f27fde6dd4237dbfbb46766dab7325f2
BLAKE2b-256 e95e5b44e135127bd4df2a3ce5b59725d8a58fe3f4726e6e4a07030d6d7eb0e8


File details

Details for the file getdaft-0.2.28-cp38-abi3-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.28-cp38-abi3-macosx_10_12_x86_64.whl
Algorithm Hash digest
SHA256 052632bf156dfabc61b00bc3e055f11c045ed1011818ed398e82bee549346510
MD5 dfbb332aebff925784c70171b1df8d21
BLAKE2b-256 965b324a5f0d3a751542754deac605fc557564cdfdf941448c3093d20b3f5384

