Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load data of any kind, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying.

Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration (see the sketch after this list)

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage
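
To make the lazy API and query optimizer concrete, here is a minimal sketch (the column names and data are invented for illustration):

import daft

# Building a dataframe only records a logical plan; nothing executes yet
df = daft.from_pydict({"a": [1, 2, 3], "b": [4, 5, 6]})
df = df.where(df["a"] > 1).with_column("c", df["a"] + df["b"])

# Inspect the plan that the query optimizer will run
df.explain()

# Execution is deferred until a materializing call such as collect() or show()
df.collect()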

Table of Contents

  • About Daft
  • Getting Started
  • Benchmarks
  • More Resources
  • Contributing
  • Telemetry
  • License

About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs (see the sketch below).
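
As a minimal sketch of the Ray integration described in point 3 (the cluster address is a placeholder; omitting it starts a local Ray instance):

import daft

# Direct subsequent dataframe execution to a Ray cluster instead of the
# default local runner. Call this before creating or executing dataframes.
daft.context.set_runner_ray(address="ray://mycluster-head:10001")

df = daft.from_pydict({"x": [1, 2, 3]})
df.collect()  # executed on Ray workers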

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
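
For example, the Ray and AWS extras mentioned above can be requested at install time (a sketch; consult the Installation Guide for the exact extras available in your version):

pip install "getdaft[aws,ray]"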

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from an AWS S3 bucket’s URLs and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32 pixels
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)


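From here the resized images behave like any other column; as a hedged continuation of the example above (the image.encode call serializes images back to bytes, and the output path is a placeholder):

# Encode the resized images as PNG bytes and persist the table
df = df.with_column("thumbnail", df["resized"].image.encode("png"))
df.write_parquet("s3://my-bucket/thumbnails/")
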
Benchmarks

[Chart: Daft benchmark results for TPC-H at scale factor 100 (SF100)]

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities, including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more (a short sketch follows this list).

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft
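
As a small taste of the groupby, aggregation, and UDF capabilities listed above, a hedged sketch (the data and function are invented for illustration):

import daft

df = daft.from_pydict({"group": ["a", "a", "b"], "value": [1, 2, 3]})

# Aggregate per group
df.groupby("group").sum("value").show()

# A simple user-defined function applied to a column
@daft.udf(return_dtype=daft.DataType.int64())
def double(values):
    return [v * 2 for v in values.to_pylist()]

df.with_column("doubled", double(df["value"])).show()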

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
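
For example, in Python the variable must be set before Daft is imported, since the session ID is generated at import time (a sketch; exporting the variable in your shell before launching Python works equally well):

import os

# Disable Daft analytics; must happen before the daft import below
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft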

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide for guidance on installing packages.

Source Distribution

  • getdaft-0.2.14.tar.gz (1.3 MB, Source)

Built Distributions

  • getdaft-0.2.14-cp37-abi3-win_amd64.whl (16.8 MB, CPython 3.7+, Windows x86-64)
  • getdaft-0.2.14-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (21.4 MB, CPython 3.7+, manylinux: glibc 2.17+ x86-64)
  • getdaft-0.2.14-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (19.9 MB, CPython 3.7+, manylinux: glibc 2.17+ ARM64)
  • getdaft-0.2.14-cp37-abi3-macosx_11_0_arm64.whl (16.2 MB, CPython 3.7+, macOS 11.0+ ARM64)
  • getdaft-0.2.14-cp37-abi3-macosx_10_7_x86_64.whl (17.6 MB, CPython 3.7+, macOS 10.7+ x86-64)

File details

Details for the file getdaft-0.2.14.tar.gz.

File metadata

  • Download URL: getdaft-0.2.14.tar.gz
  • Upload date:
  • Size: 1.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.14.tar.gz:

  • SHA256: 07cd349fe961536c6bd172b6cf02217fb1fd23a2a021571f6a5c6f0dddb184ec
  • MD5: 0dc91918bbb4547dfebb0135b1bd94f4
  • BLAKE2b-256: fee936b06fd8c9af53140e7a362155f9d8ef146c991dc7708cfbfb7f315985af


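To verify a downloaded artifact against a published digest, a minimal sketch using only the Python standard library (assuming the source distribution above sits in the current directory):

import hashlib

# Compute the local file's SHA256 and compare it to the digest published above
with open("getdaft-0.2.14.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = "07cd349fe961536c6bd172b6cf02217fb1fd23a2a021571f6a5c6f0dddb184ec"
print("OK" if digest == expected else "MISMATCH")
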
File details

Details for the file getdaft-0.2.14-cp37-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.14-cp37-abi3-win_amd64.whl
  • Upload date:
  • Size: 16.8 MB
  • Tags: CPython 3.7+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.14-cp37-abi3-win_amd64.whl:

  • SHA256: 34ebbbf040982f219c3bae5e2046322150332a3e65ad67ba71985db59b2fff68
  • MD5: 3d55319077754abba9b2629a2fdc2d1a
  • BLAKE2b-256: 91bd5e87d1a91138692c335292faa37b6e8748b222a98c0590938e156787d643

File details

Details for the file getdaft-0.2.14-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File hashes

Hashes for getdaft-0.2.14-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

  • SHA256: 110b0b35bc9732926f9a8b9c3f6996d834a0f24ccb0c400f9b03d580dcd90096
  • MD5: 3b47273e204649715cc95363b0d75b81
  • BLAKE2b-256: b873d6895c444cc57ee0a6c57adddc8ac2f2c20c5083aedc2252ce0d2b1a4eb7

File details

Details for the file getdaft-0.2.14-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File hashes

Hashes for getdaft-0.2.14-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl:

  • SHA256: 3add9d73702765e261d3e88ef04099d6a9c95f90d19c96ff4e014f4d9ed6433e
  • MD5: 994942f70d59fde7c995eb95f11fc1c5
  • BLAKE2b-256: 4ee9c288d1bc6a454c9e0c4b67882e3ddd3662e22b1f9d8aeac717f187d8c78e

File details

Details for the file getdaft-0.2.14-cp37-abi3-macosx_11_0_arm64.whl.

File hashes

Hashes for getdaft-0.2.14-cp37-abi3-macosx_11_0_arm64.whl:

  • SHA256: 8cf445538b7e0d5016c548b0e951dab3f5cf2fe6303bddd09d9c5ff5747894e1
  • MD5: 454fd4f05ced99564206539f01e1bf7d
  • BLAKE2b-256: c3737612dfa27a37f0f6345540800fce41e1c31575ac3780d33b3d98b1c3620f

File details

Details for the file getdaft-0.2.14-cp37-abi3-macosx_10_7_x86_64.whl.

File hashes

Hashes for getdaft-0.2.14-cp37-abi3-macosx_10_7_x86_64.whl:

  • SHA256: 01754aa7f6059cfec363eff9a8e46197bbbc65634da9c12cd998ce2223fb8c45
  • MD5: 95da52a88a46afe1731f1a0834b02a1f
  • BLAKE2b-256: dc9a03c977089393e47f6940f5488944ec57f855d839ca18d7418f489171c207
