
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible (see the sketch after this list)

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage
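To make the first two points concrete, here is a minimal sketch of how the lazy API and optimizer behave (the bucket path and column names are invented for illustration; nothing executes until the dataframe is materialized):

import daft

# Building the dataframe only constructs a query plan; no data is read yet.
df = daft.read_parquet("s3://my-bucket/data/*.parquet")  # hypothetical path
df = df.where(df["score"] > 0.5).select("id", "score")

# Inspect the plan without executing it (pass True to also show the optimized plan).
df.explain(True)

# Execution happens only on materialization.
df.collect()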

About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs (a sketch follows this list).
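As a minimal sketch of point 3, moving from the default local runner to a Ray cluster is a one-line change (the address is a placeholder for your own Ray head node; omit it to start a local Ray instance):

import daft

# Select the Ray runner before building any dataframes.
# The address below is a placeholder, not a real cluster.
daft.context.set_runner_ray(address="ray://<head-node>:10001")

df = daft.from_pydict({"x": [1, 2, 3]})
print(df.collect())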

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
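As an example of what those extras look like (the extra names here are our reading of the Installation Guide and may change between versions; the guide is authoritative):

pip install getdaft            # core installation
pip install "getdaft[aws]"     # adds AWS integration utilities
pip install "getdaft[ray]"     # adds the Ray runner for distributed execution
pip install "getdaft[all]"     # everything, including the above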

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs pointing into an AWS S3 bucket and resize each one in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

[Image: dataframe loading a folder of images from AWS S3 and creating thumbnails]

Benchmarks

[Chart: benchmark results for the TPC-H suite at scale factor 100 (SF100)]

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
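Because the session ID is generated when Daft is imported (see below), the variable should be set before the import; a minimal sketch in Python (exporting the variable in your shell works just as well):

import os

# Opt out of telemetry; this must happen before `import daft`,
# since analytics are initialized at import time.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft  # this session emits no analytics events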

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.2.17.tar.gz (1.3 MB) - Source

Built Distributions

  • getdaft-0.2.17-cp37-abi3-win_amd64.whl (17.0 MB) - CPython 3.7+, Windows x86-64
  • getdaft-0.2.17-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (21.5 MB) - CPython 3.7+, manylinux: glibc 2.17+ x86-64
  • getdaft-0.2.17-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (20.1 MB) - CPython 3.7+, manylinux: glibc 2.17+ ARM64
  • getdaft-0.2.17-cp37-abi3-macosx_11_0_arm64.whl (16.4 MB) - CPython 3.7+, macOS 11.0+ ARM64
  • getdaft-0.2.17-cp37-abi3-macosx_10_7_x86_64.whl (17.8 MB) - CPython 3.7+, macOS 10.7+ x86-64

File details

Details for the file getdaft-0.2.17.tar.gz.

File metadata

  • Download URL: getdaft-0.2.17.tar.gz
  • Upload date:
  • Size: 1.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.17.tar.gz:

  • SHA256: 2916687428b39b6a35f810983ba333e2b418d1905c1b7d5baa0f2c59beb5da1d
  • MD5: 289306b283002e5a84f39443d411bc54
  • BLAKE2b-256: 78d6f3d9640fd3882e8b0f52001dfad376d49be203bcd27b32d952cb97447b5c
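To verify a downloaded artifact against the digest above, a minimal sketch (adjust the filename to wherever the download was saved):

import hashlib

# Compare the downloaded sdist against the SHA256 published above.
expected = "2916687428b39b6a35f810983ba333e2b418d1905c1b7d5baa0f2c59beb5da1d"
with open("getdaft-0.2.17.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()
assert actual == expected, "SHA256 mismatch: do not install this file"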

File details

Details for the file getdaft-0.2.17-cp37-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.17-cp37-abi3-win_amd64.whl
  • Upload date:
  • Size: 17.0 MB
  • Tags: CPython 3.7+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.17-cp37-abi3-win_amd64.whl:

  • SHA256: b44922cdac07f7b6194971e18571b3e61aefa7f84fefedb659f51443db2f027c
  • MD5: ac0148147bd1545f91567be04bfd60c2
  • BLAKE2b-256: 9508ddd251a57e010a6ee2a6b4b3fda69522d0b245dd670b62eac1ee620d63f6

File details

Details for the file getdaft-0.2.17-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File hashes

Hashes for getdaft-0.2.17-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

  • SHA256: 67f30d77c76288599e730ed8ffc8df1f6551928a62e9198627ef7a94217f28cb
  • MD5: 1444fe27222e4001d7addba161e6f3c8
  • BLAKE2b-256: 85bc87b5363adc16d3bb3d6c35d69777c82620868e29ab00dc8e67e3ea2f59c5

File details

Details for the file getdaft-0.2.17-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File hashes

Hashes for getdaft-0.2.17-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl:

  • SHA256: 8dc9f49a055add1b14150a3c8b006b181b67d1edda706b15c69b3a73c92657cf
  • MD5: 9636d27f766501afcfc38a266b1b98b6
  • BLAKE2b-256: fd1d856b40a53cfdd01f24c65cc4ff4487208b14a74847a27bfb8d2d24f1314c

File details

Details for the file getdaft-0.2.17-cp37-abi3-macosx_11_0_arm64.whl.

File hashes

Hashes for getdaft-0.2.17-cp37-abi3-macosx_11_0_arm64.whl:

  • SHA256: 766655e2c0d738184ccfe6079aad08653fac4d7a422d1f539e5e48835e9d0ad0
  • MD5: b68f333fd450aa5ce0a1930df8e3d002
  • BLAKE2b-256: e82c6a843b8292e76001c786acab376cc6bd873eb12d45360905a099346235d8

File details

Details for the file getdaft-0.2.17-cp37-abi3-macosx_10_7_x86_64.whl.

File hashes

Hashes for getdaft-0.2.17-cp37-abi3-macosx_10_7_x86_64.whl:

  • SHA256: 8661abdb1cad6edeffc8350a251821820a5ece668a3d56a8e2ad8fd5bc839857
  • MD5: 7b250fa760a94536c2bc5653ee3c5bb0
  • BLAKE2b-256: 3d45acff6a198ca937fd4bc5d8c4d97dd9c2ba51645e4e2e3a6e5e0c7ed1d21c
