
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a tabular dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage
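
As a minimal sketch of these pieces working together, here is a query reading Parquet straight from S3 (the bucket path and column names are hypothetical):

import daft

# Read Parquet files directly from S3; the query optimizer prunes
# unused columns and pushes the filter down before data is scanned.
df = daft.read_parquet("s3://my-bucket/events/*.parquet")
df = df.where(daft.col("status") == "ok").select("user_id", "status")
df.show()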

About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop's computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs, as sketched below.
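
A minimal sketch of switching to the Ray runner (assumes Ray is installed; the cluster address is a placeholder, and omitting it starts or attaches to a local Ray instance):

import daft

# Route all subsequent dataframe execution through a Ray cluster.
# "ray://head-node:10001" is a hypothetical address.
daft.context.set_runner_ray(address="ray://head-node:10001")

df = daft.from_pydict({"x": [1, 2, 3]})
df.show()  # executed by Ray workers rather than the local machine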

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
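
For example, assuming the extras documented in the Installation Guide, the AWS and Ray dependencies can be pulled in via pip extras: pip install "getdaft[aws,ray]"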

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails
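
To actually execute the query and persist the output, a minimal continuation (the output path is a placeholder; PNG is an arbitrary encoding choice, and whether a raw image column round-trips through Parquet depends on encoding it to bytes first):

# Materialize the lazy query in memory
df = df.collect()

# Encode the resized images back into bytes, then write the table as Parquet
df = df.with_column("resized_png", df["resized"].image.encode("png"))
df.select("path", "resized_png").write_parquet("s3://my-bucket/resized/")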

Benchmarks

Benchmarks for SF100 TPCH

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more; a small UDF sketch follows this list.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft
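
To give a flavor of UDFs, here is a minimal sketch (the function, column names and return dtype are illustrative, not from the docs above):

import daft

# A UDF receives whole columns as daft.Series batches; returning a
# Python list of the same length is accepted.
@daft.udf(return_dtype=daft.DataType.int64())
def add_one(col):
    return [v + 1 for v in col.to_pylist()]

df = daft.from_pydict({"x": [1, 2, 3]})
df = df.with_column("x_plus_one", add_one(df["x"]))
df.show()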

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
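
Since the session ID is generated when Daft is imported (see below), setting the variable before the first import is the safe ordering; a minimal sketch:

import os

# Opt out of telemetry before daft is imported, since the analytics
# session is created at import time.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft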

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID that is generated when Daft is imported

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.


License

Daft has an Apache 2.0 license - please see the LICENSE file.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.3.0.tar.gz (3.6 MB) - Source

Built Distributions

getdaft-0.3.0-cp38-abi3-win_amd64.whl (26.6 MB) - CPython 3.8+, Windows x86-64

getdaft-0.3.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (29.2 MB) - CPython 3.8+, manylinux: glibc 2.17+ x86-64

getdaft-0.3.0-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (28.0 MB) - CPython 3.8+, manylinux: glibc 2.17+ ARM64

getdaft-0.3.0-cp38-abi3-macosx_11_0_arm64.whl (24.3 MB) - CPython 3.8+, macOS 11.0+ ARM64

getdaft-0.3.0-cp38-abi3-macosx_10_12_x86_64.whl (26.3 MB) - CPython 3.8+, macOS 10.12+ x86-64

File details

Details for the file getdaft-0.3.0.tar.gz.

File metadata

  • Download URL: getdaft-0.3.0.tar.gz
  • Upload date:
  • Size: 3.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for getdaft-0.3.0.tar.gz

  • SHA256: 6743b12bfbcc241db56ac89c71ef1857722c5f8ed120311bc434006a8423bdee
  • MD5: b72d2970a8e503fc30c6846e75d379e3
  • BLAKE2b-256: 27fca71c5e5932d039be3cfc3f96088c09c7c8522b40c6cb6a8c88e932137c40

See more details on using hashes here.
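
The SHA256 digest above can also be checked locally with the standard library; a minimal sketch, assuming the archive was downloaded into the current directory:

import hashlib

expected = "6743b12bfbcc241db56ac89c71ef1857722c5f8ed120311bc434006a8423bdee"

# Stream the archive through SHA256 in chunks to avoid loading it whole
h = hashlib.sha256()
with open("getdaft-0.3.0.tar.gz", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)

assert h.hexdigest() == expected, "hash mismatch: do not install this file"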

File details

Details for the file getdaft-0.3.0-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.3.0-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 26.6 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for getdaft-0.3.0-cp38-abi3-win_amd64.whl

  • SHA256: 3cc2b19b3caf2ffcf488ed0bbbbc494f64b4a55c1ce0c0dcc2ab06a14eb055da
  • MD5: 37144beb74372a34660f95f64a05ecd9
  • BLAKE2b-256: 6344d6bbe5911582bab473480c259a87dd5745ee978dc1d051ec4bd462f3cb7d


File details

Details for the file getdaft-0.3.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File hashes

Hashes for getdaft-0.3.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

  • SHA256: 2bee8ff4f340ab9d869a01a7b633d20c1c7061beacbe93583e339431b3f0fd6c
  • MD5: f7c787b00a5270866339675f388a2767
  • BLAKE2b-256: b9b39ccda7562e48c0795f9ce0a6cc360c7e2f7987fac32e60e16fef69e4693e


File details

Details for the file getdaft-0.3.0-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File hashes

Hashes for getdaft-0.3.0-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl

  • SHA256: 2585c132fb6a2c11399d63b79444d3d622a8d8d0f9614a2f1dad376a2fba9224
  • MD5: 7e4228fa19762a5a7c44b4702e34cc06
  • BLAKE2b-256: f2be3ffe28f3fab3880aa7dc12a0ddcf3485a09c5bd1be7d31ce2967aea79cf2


File details

Details for the file getdaft-0.3.0-cp38-abi3-macosx_11_0_arm64.whl.

File hashes

Hashes for getdaft-0.3.0-cp38-abi3-macosx_11_0_arm64.whl

  • SHA256: aa5169f116c474c86b07c00d1105854c12c0f43db6455d299e65374f07d26dbc
  • MD5: f1ca220640282d3939114e9169b53aaa
  • BLAKE2b-256: c8b081d337016e506ef45ff67445682a1d8601f231cfb6fcb430565ba5f9f0c5


File details

Details for the file getdaft-0.3.0-cp38-abi3-macosx_10_12_x86_64.whl.

File hashes

Hashes for getdaft-0.3.0-cp38-abi3-macosx_10_12_x86_64.whl

  • SHA256: 1912c3498e79de21f58d9481a7ec2f1e5086d14668a68f09c947612fdae4367d
  • MD5: 6695d3644420d21b6a631770d232c00b
  • BLAKE2b-256: e8d449d9c10795f248f8a60a687e53bb13497bcaf4f4ea01dc644ddbc7c6a210

