
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from an AWS S3 bucket’s URLs and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image into 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)


Benchmarks

(Benchmark chart: SF100 TPC-H)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities, including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
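As a minimal sketch from Python: since the session ID is generated on import of Daft, the assumption here is that the opt-out variable is also read at import time, so set it before the first `import daft`.

```python
import os

# Daft's telemetry session ID is generated when the library is imported,
# so (assumed) the opt-out variable must be set before the first import.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

# import daft  # telemetry is now disabled for this process
```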

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.2.22.tar.gz (1.6 MB)

Uploaded Source

Built Distributions

getdaft-0.2.22-cp37-abi3-win_amd64.whl (17.5 MB)

Uploaded CPython 3.7+ Windows x86-64

getdaft-0.2.22-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (19.9 MB)

Uploaded CPython 3.7+ manylinux: glibc 2.17+ x86-64

getdaft-0.2.22-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (19.0 MB)

Uploaded CPython 3.7+ manylinux: glibc 2.17+ ARM64

getdaft-0.2.22-cp37-abi3-macosx_11_0_arm64.whl (15.8 MB)

Uploaded CPython 3.7+ macOS 11.0+ ARM64

getdaft-0.2.22-cp37-abi3-macosx_10_12_x86_64.whl (17.2 MB)

Uploaded CPython 3.7+ macOS 10.12+ x86-64

File details

Details for the file getdaft-0.2.22.tar.gz.

File metadata

  • Download URL: getdaft-0.2.22.tar.gz
  • Upload date:
  • Size: 1.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.22.tar.gz:

  • SHA256: 23f4a08e3f67d3c309b25ff28cd36509542a6ff8da41e2e4d7fec39c3fbe0583
  • MD5: f0adc9b783930a7a5d33b37a58ce6048
  • BLAKE2b-256: 5777905d1a0989eefd6277a7ed29686ad7603945f3a8f2a54f5893011f43dcd0

See more details on using hashes here.
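If you want to check a downloaded file against the digests listed here, a minimal stdlib sketch (the filename and expected SHA256 are copied from the table above):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Hex SHA256 digest of a file, read in chunks to bound memory use."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the SHA256 listed above for the sdist:
# sha256_of("getdaft-0.2.22.tar.gz") == "23f4a08e3f67d3c309b25ff28cd36509542a6ff8da41e2e4d7fec39c3fbe0583"
```

The same function works for any of the wheel files below; only the expected digest changes.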

File details

Details for the file getdaft-0.2.22-cp37-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.22-cp37-abi3-win_amd64.whl
  • Upload date:
  • Size: 17.5 MB
  • Tags: CPython 3.7+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.22-cp37-abi3-win_amd64.whl:

  • SHA256: 88925b33bad4110bc474d26c599c113a428e4b3b44b2d4b6175f4540e5bed4b5
  • MD5: cf10752dfb2231155960a6b7dd5c1c93
  • BLAKE2b-256: 349e85c87ed9cf9288930512655ed50791b304d65706116089f62a9798725c68

See more details on using hashes here.

File details

Details for the file getdaft-0.2.22-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.


File hashes

Hashes for getdaft-0.2.22-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

  • SHA256: 7772948d547df0f84f25c2ad79ee9a720682a2829ce11bf8450d639cca305568
  • MD5: ce4cb760611e6272c274652134056ffd
  • BLAKE2b-256: 4bed3b21065a050fbed85a1deee4e7517d2ecf9c5f6cda20c95f20297bdc339d

See more details on using hashes here.

File details

Details for the file getdaft-0.2.22-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.


File hashes

Hashes for getdaft-0.2.22-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl:

  • SHA256: 0057b1f1df8a9bba5f34ef2feeda1f68bc1334e0ada1caa77d9b74dfde288421
  • MD5: 49ee448862479d7e274b63be013e0e3d
  • BLAKE2b-256: fbf03bca613eff05e01df798fd8a8afe64f0654e6e7abe3a33f34a57c5bfbd04

See more details on using hashes here.

File details

Details for the file getdaft-0.2.22-cp37-abi3-macosx_11_0_arm64.whl.


File hashes

Hashes for getdaft-0.2.22-cp37-abi3-macosx_11_0_arm64.whl:

  • SHA256: e2839de4d93dab78ddbb47072e23ad933f5a13d457c04edbb9e0264a1c883b19
  • MD5: eeac4cd264f05be18ea5c82b214f837a
  • BLAKE2b-256: 76b04cf44fa437f1c31e8419903a8a2dd5cf208cebf6b976662f117efa4c8795

See more details on using hashes here.

File details

Details for the file getdaft-0.2.22-cp37-abi3-macosx_10_12_x86_64.whl.


File hashes

Hashes for getdaft-0.2.22-cp37-abi3-macosx_10_12_x86_64.whl:

  • SHA256: c1527704a67541dbb1c7c97a1a121bb50885bceabd5d50e7eec5bbd79364c941
  • MD5: 200bc616158a471583ef73473fea629e
  • BLAKE2b-256: 739ff26fdb29369690a48a39c58fbb011fba2445bbda8d04dbf10236c911ab76

See more details on using hashes here.
