Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any kind of data, such as PDF documents, images, Protobuf messages, CSV files, Parquet files, and audio files, into a tabular dataframe structure for easy querying.

Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python, implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration (see the sketch after this list)

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage
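
As a minimal sketch of the lazy API and query optimizer bullets above (the Parquet path is a placeholder; read_parquet, where, select, explain and collect are Daft DataFrame operations):

import daft

# Building a dataframe is lazy: nothing is read at this point.
df = daft.read_parquet("s3://my-bucket/dataset/*.parquet")  # placeholder path

# Transformations are recorded into a logical plan rather than executed.
df = df.where(df["value"] > 0).select(df["id"], df["value"])

# Print the plan that the query optimizer rewrites (e.g. pushing filters down).
df.explain()

# Execution happens only when results are requested.
df = df.collect()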

Table of Contents

  • About Daft
  • Getting Started
  • Benchmarks
  • More Resources
  • Contributing
  • Telemetry
  • License

About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs (a brief sketch follows this list).
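
As a brief sketch of that Ray integration (daft.context.set_runner_ray is Daft's documented switch onto the Ray runner; the cluster address below is a placeholder, and calling it with no address starts a local Ray instance):

import daft

# Route all subsequent dataframe execution to a Ray cluster.
daft.context.set_runner_ray(address="ray://<head-node>:10001")  # placeholder address

df = daft.from_pydict({"x": [1, 2, 3]})
df.show()  # this query now executes on Ray workers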

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
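
For example, the Ray and AWS extras can typically be installed with pip install "getdaft[ray]" or pip install "getdaft[aws]"; the Installation Guide lists the authoritative extra names.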

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

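# Show the first 3 rows of the dataframe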
df.show(3)

(Demo: dataframe code loading a folder of images from AWS S3 and creating thumbnails.)

Benchmarks

(Chart: benchmarks for SF100 TPC-H.)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities, including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more (a small UDF sketch follows this list).

  • User Guide - take a deep dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft
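
As a taste of the user-defined functions mentioned above, here is a minimal sketch (the @daft.udf decorator and its return_dtype argument follow Daft's documented API; the doubling function itself is purely illustrative):

import daft

@daft.udf(return_dtype=daft.DataType.int64())
def double(col):
    # The column arrives as a Daft Series; returning a plain Python list is accepted.
    return [x * 2 for x in col.to_pylist()]

df = daft.from_pydict({"x": [1, 2, 3]})
df = df.with_column("doubled", double(df["x"]))
df.show()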

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
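
From Python, one way to opt out is to set the variable before Daft is first imported (a minimal sketch; it must precede the import because the session ID is generated at import time):

import os

# Disable Daft telemetry; must run before the first import of daft.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft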

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.2.15.tar.gz (1.3 MB, Source)

Built Distributions

  • getdaft-0.2.15-cp37-abi3-win_amd64.whl (16.8 MB, CPython 3.7+, Windows x86-64)
  • getdaft-0.2.15-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (21.4 MB, CPython 3.7+, manylinux: glibc 2.17+ x86-64)
  • getdaft-0.2.15-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (19.9 MB, CPython 3.7+, manylinux: glibc 2.17+ ARM64)
  • getdaft-0.2.15-cp37-abi3-macosx_11_0_arm64.whl (16.3 MB, CPython 3.7+, macOS 11.0+ ARM64)
  • getdaft-0.2.15-cp37-abi3-macosx_10_7_x86_64.whl (17.7 MB, CPython 3.7+, macOS 10.7+ x86-64)

File details

Details for the file getdaft-0.2.15.tar.gz.

File metadata

  • Download URL: getdaft-0.2.15.tar.gz
  • Upload date:
  • Size: 1.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.15.tar.gz
  Algorithm    Hash digest
  SHA256       5729915db8e15b6d42568cceef4a588ecdc5ce1a29f52c362e41d789bffb32e7
  MD5          f7fb6419970fb3a86923f99d699d3419
  BLAKE2b-256  b2af82bdf9c06ed6d7abd95c25f53fec93527376f4b7a3117173093f16d36f06

File details

Details for the file getdaft-0.2.15-cp37-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.15-cp37-abi3-win_amd64.whl
  • Upload date:
  • Size: 16.8 MB
  • Tags: CPython 3.7+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.15-cp37-abi3-win_amd64.whl
  Algorithm    Hash digest
  SHA256       10103355c8a48455a1b2262bc1b7eca6b495059da7f2d220758bc273b734b898
  MD5          e7d30f8086ac0ae4fbf015b5d674026f
  BLAKE2b-256  c22837ad53d021c1a983ad7455b98b20a5c99574ce6a621afc51e96d9d9f7e08

File details

Details for the file getdaft-0.2.15-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File hashes

Hashes for getdaft-0.2.15-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  Algorithm    Hash digest
  SHA256       44424e2adc80f12e3a404cc389a1b37eabbd1c8a3f0345d218852a65a7c3593d
  MD5          353aa5fc57adbb4f730af02849b8b2fe
  BLAKE2b-256  be02aa5b1de8042dd0a9259cfef8f39ccad6470d712cbb026c41163eea6ad658

File details

Details for the file getdaft-0.2.15-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File hashes

Hashes for getdaft-0.2.15-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
  Algorithm    Hash digest
  SHA256       580a9971807e30a21136ae10eeb39cb2c880ab6eb87a464447206e4d36d52e2b
  MD5          a2fba1e0697b9b046eb39d6e802dc5d0
  BLAKE2b-256  10ad6c0963348a419fb091f1e988384ed70598a75af7b71d39ddd95f6d326dd8

File details

Details for the file getdaft-0.2.15-cp37-abi3-macosx_11_0_arm64.whl.

File hashes

Hashes for getdaft-0.2.15-cp37-abi3-macosx_11_0_arm64.whl
  Algorithm    Hash digest
  SHA256       6afb66507ae899fb32adc8532b25ddd245a14d695a099ceb594afcb24848adb0
  MD5          2b2c0fe30b511f44a7532710900a95d3
  BLAKE2b-256  8a11dafdd3028a1491b772e458b39ed4f636b4605ec4be2031b880e6afbb8d11

File details

Details for the file getdaft-0.2.15-cp37-abi3-macosx_10_7_x86_64.whl.

File hashes

Hashes for getdaft-0.2.15-cp37-abi3-macosx_10_7_x86_64.whl
  Algorithm    Hash digest
  SHA256       95c16b0f25a78a13cab21128bcd0b5e7af65a14b12fd037ef8ec8aee5b7ac911
  MD5          a3fd6c78bdd1b7df378536b66653f74e
  BLAKE2b-256  da93c5c70b0481e7c336dccb55b1e0c9885b2f25950011a4f3a2df0d2f71ca7b
