
Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible (see the sketch after this list)

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage
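
The laziness and query-optimizer bullets above are easiest to see in action. Below is a minimal sketch (the column names and data are made up for illustration): building a dataframe only queues up a logical plan, explain() prints the plan the optimizer produced, and collect() executes it.

import daft

# Nothing is computed yet: Daft dataframes are lazy
df = daft.from_pydict({"a": [1, 2, 3, 4], "b": ["w", "x", "y", "z"]})

# Chained transformations only extend the logical plan
df = df.where(df["a"] > 2).select(df["a"])

df.explain()  # print the query plan
df.collect()  # execute the plan and materialize the result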


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for an interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs (see the sketch below).
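
For the distributed case, a minimal sketch of switching execution to Ray, assuming a reachable Ray cluster (the address below is a placeholder; calling set_runner_ray() with no address attaches to, or starts, a local Ray instance):

import daft

# Placeholder address: point this at your Ray cluster's head node
daft.context.set_runner_ray(address="ray://mycluster-head:10001")

# Subsequent dataframe execution now runs on the Ray cluster
df = daft.from_pydict({"x": list(range(1_000_000))})
print(df.count_rows())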

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
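
For example, assuming the extras names documented in the Installation Guide (aws and ray are shown here for illustration):

pip install getdaft

# Optional extras pull in integrations such as AWS utilities and the Ray runner
pip install "getdaft[aws,ray]"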

Quickstart

Check out our 10-minute quickstart!

In this example, we load file paths from an AWS S3 bucket, download and decode each image, and resize it:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

(Screenshot: dataframe output from loading a folder of images from AWS S3 and creating thumbnails.)
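
Execution is lazy, so show(3) above only computes the rows it displays. As a follow-on sketch, one way to materialize and persist the full result, assuming the .image.encode() expression and write_parquet() behave as documented (the output path is a placeholder):

# Encode images back to bytes so they serialize compactly
df = df.with_column("resized_jpeg", df["resized"].image.encode("jpeg"))

# Write the paths and encoded thumbnails out as Parquet (placeholder path)
df.select("path", "resized_jpeg").write_parquet("s3://my-bucket/thumbnails/")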

Benchmarks

(Benchmark chart: TPC-H at scale factor 100, SF100.)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more (a UDF sketch follows this list).

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft
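
As a taste of UDFs, here is a minimal sketch using the @daft.udf decorator (the function and column names are made up; a Daft UDF receives whole columns as daft.Series rather than single values):

import daft

@daft.udf(return_dtype=daft.DataType.int64())
def add_one(col: daft.Series):
    # Convert the column to a Python list, transform it, and return it
    return [v + 1 for v in col.to_pylist()]

df = daft.from_pydict({"n": [1, 2, 3]})
df = df.with_column("n_plus_one", add_one(df["n"]))
df.show()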

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
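
For example, from Python (a minimal sketch; set the variable before daft is imported so it is picked up when the session ID is generated, or export it in your shell instead):

import os

# Must be set before importing daft
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft  # telemetry is disabled for this session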

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform.

Source Distribution

  • getdaft-0.2.29.tar.gz (3.3 MB) - Source

Built Distributions

  • getdaft-0.2.29-cp38-abi3-win_amd64.whl (19.6 MB) - CPython 3.8+, Windows x86-64
  • getdaft-0.2.29-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (22.0 MB) - CPython 3.8+, manylinux (glibc 2.17+), x86-64
  • getdaft-0.2.29-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (21.1 MB) - CPython 3.8+, manylinux (glibc 2.17+), ARM64
  • getdaft-0.2.29-cp38-abi3-macosx_11_0_arm64.whl (17.8 MB) - CPython 3.8+, macOS 11.0+, ARM64
  • getdaft-0.2.29-cp38-abi3-macosx_10_12_x86_64.whl (19.3 MB) - CPython 3.8+, macOS 10.12+, x86-64

File details

Details for the file getdaft-0.2.29.tar.gz.

File metadata

  • Download URL: getdaft-0.2.29.tar.gz
  • Size: 3.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

  • SHA256: 5b69a952811389a4dcb5f55eabd68b2c868a45094e75363e073d547b505b932d
  • MD5: 777ed2a13276e8deb83d6d5e87f7deee
  • BLAKE2b-256: bcb1d8793e5d34cd850a5699739a1603b3f8efa49c28e087abffbaa5338e5148

File details

Details for the file getdaft-0.2.29-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.29-cp38-abi3-win_amd64.whl
  • Size: 19.6 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

  • SHA256: 849f02333145f4814b628c6998fabaf9d361f9cab8ea043eddd6bdb0d63d0088
  • MD5: 408330e240eb21127ac3cab6e7933707
  • BLAKE2b-256: cef3800a8de9133c3d8e5ff4f8dead09330f956cfd263057ee957f606ec03b98

File details

Details for the file getdaft-0.2.29-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File hashes

  • SHA256: bd904a5b0c592f853e612f6fcf8378b7175fc466187967138c3e23811fa63c21
  • MD5: e42a1348272f8fd0c476a8cd89b75def
  • BLAKE2b-256: 30c3644fa11f74998c893ad2731341b7e6c45b7f6ad0a30d9b50430d688551f8

File details

Details for the file getdaft-0.2.29-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File hashes

  • SHA256: ed828e1c87ce6f3b68d7ad7ad2aa3fa7604175d1ae2245459a3793297ecabfe0
  • MD5: 8560ca370472867bb08e9f40dd1cb029
  • BLAKE2b-256: 01fa545de6e14cda7daffa9a233135e4d33680bb13dfc44b3531569dd57c6642

File details

Details for the file getdaft-0.2.29-cp38-abi3-macosx_11_0_arm64.whl.

File hashes

  • SHA256: bf2f3a6f17ff405c262272f3d232deea729dbb59bb3209e2544d63d82b43efb8
  • MD5: f2eef461f8909bd667c0a53a6a6624ec
  • BLAKE2b-256: 10b3e5ff3ac10b1c26eebb56d94bbb79e4e0a97ec4d162b083da455dceedb21a

File details

Details for the file getdaft-0.2.29-cp38-abi3-macosx_10_12_x86_64.whl.

File hashes

  • SHA256: 358ad9b907f49597bf14cd785a7ec224529919a42353e9d0c1746543e0aee331
  • MD5: ee84c8784f53af44949ca55a578a7ac4
  • BLAKE2b-256: 556a8f1c5b968b4186646349063db408ce26f299aaf23f5ad2a373938655a6b9
