
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying.



Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
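As a sketch, extras are typically pulled in at install time via pip's extras syntax (the extra names `aws` and `ray` below are assumptions; verify them against the Installation Guide):

```shell
# Basic installation
pip install getdaft

# With extra dependencies for AWS and Ray integrations
# (extra names assumed; check the Installation Guide for the current list)
pip install "getdaft[aws,ray]"
```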

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from an AWS S3 bucket’s URLs and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image into 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

[Screenshot: dataframe output after loading a folder of images from AWS S3 and creating thumbnails]

Benchmarks

[Benchmark results for SF100 TPC-H]

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
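The environment variable can also be set programmatically, as long as it happens before Daft is imported (since the session ID is generated at import time). A minimal sketch:

```python
import os

# Opt out of Daft's analytics. This must run before `import daft`,
# because the telemetry session is created when Daft is imported.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

# import daft  # imported afterwards, with telemetry disabled
```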

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.2.24.tar.gz (2.2 MB): Source

Built Distributions

  • getdaft-0.2.24-cp38-abi3-win_amd64.whl (17.9 MB): CPython 3.8+, Windows x86-64
  • getdaft-0.2.24-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (20.3 MB): CPython 3.8+, manylinux glibc 2.17+, x86-64
  • getdaft-0.2.24-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (19.4 MB): CPython 3.8+, manylinux glibc 2.17+, ARM64
  • getdaft-0.2.24-cp38-abi3-macosx_11_0_arm64.whl (16.2 MB): CPython 3.8+, macOS 11.0+, ARM64
  • getdaft-0.2.24-cp38-abi3-macosx_10_12_x86_64.whl (17.6 MB): CPython 3.8+, macOS 10.12+, x86-64

File details

Details for the file getdaft-0.2.24.tar.gz.

File metadata

  • Download URL: getdaft-0.2.24.tar.gz
  • Upload date:
  • Size: 2.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.24.tar.gz:

  • SHA256: 1fa4eae81ab101bed544ee64e3128e2df4f267a87640cd1473e00f944c32a216
  • MD5: 43781c017f68433267c8db0ac688cb6e
  • BLAKE2b-256: 7268d07f3af9d9d0d119dba7bfb81532022ff85448b46a2e327314f695755cf9

See more details on using hashes here.

File details

Details for the file getdaft-0.2.24-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.24-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 17.9 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.24-cp38-abi3-win_amd64.whl:

  • SHA256: c77266e55245c95a5c972dd49a47a764cde1b2007cc30ab08c2f25f7a36d6697
  • MD5: 971834abcaa93b2bd9a732b40702ebfa
  • BLAKE2b-256: 862027da9f20566693ae193ac88d7e3f8cbf5a1d069f0331b91add9492bddcbd


File details

Details for the file getdaft-0.2.24-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.24-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

  • SHA256: 473881f9406d166dace7f12a3cb74915f8901b628f6d9f0900fdf69cf05b0031
  • MD5: 013ec0dc764f6cfcd906e5a5d0bc49ac
  • BLAKE2b-256: a7560711195b42669d4723c987478de6ac0e699652e1cd814d8cf0fb0e1168e5


File details

Details for the file getdaft-0.2.24-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.24-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl:

  • SHA256: ae0d0ae1238fa5eb2ddfbefbc52e47aa6f9d00e9621dde0ecbee70be43cee8e8
  • MD5: 94180938d820a056e18ac09d0bb3aa49
  • BLAKE2b-256: bfc6578ccf561011b274ac6b5956531a6a8b1b4d8a9ab04d7450f530b354325d


File details

Details for the file getdaft-0.2.24-cp38-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.24-cp38-abi3-macosx_11_0_arm64.whl:

  • SHA256: 1c27ff4e3e00275db611c8fad5edefc1a24f8494093ce18f0b846b147b4d6cd6
  • MD5: 205e34048150cfe9983cbe26b729d86a
  • BLAKE2b-256: e22b6fe68999d93b6d27e873f81fa054a3d401117daa0f833cf22019427313d0


File details

Details for the file getdaft-0.2.24-cp38-abi3-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.24-cp38-abi3-macosx_10_12_x86_64.whl:

  • SHA256: 6dbb2c25f14c008fe1323590dc86bbed9d0de8b470aa62c0844bb218864b42da
  • MD5: b8a657a760c59be4ed665c8b0f50be29
  • BLAKE2b-256: e18742987172c0dc79c7afcc91ebb1a2855a9a9805832ed207da34f4209bcc00

