
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a tabular dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: a lazy Python dataframe for rapid and interactive iteration (see the sketch after this list)

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage
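A minimal sketch of the lazy API and optimizer at work, assuming the 0.2.x dataframe API; the Parquet path and column names are hypothetical:

import daft

# Nothing executes yet: each call below only extends the query plan
df = daft.read_parquet("s3://my-bucket/data/*.parquet")
df = df.where(df["score"] > 0.5).select("id", "score")

# Inspect the plan the query optimizer will run
df.explain()

# Execution happens only on materialization
result = df.collect()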


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop's computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs (a minimal Ray sketch follows this list).
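A minimal sketch of switching to the Ray runner, assuming daft.context.set_runner_ray as exposed in the 0.2.x releases; the cluster address is hypothetical, and omitting it lets Daft start a local Ray instance instead:

import daft

# Route all subsequent dataframe execution through Ray
daft.context.set_runner_ray(address="ray://my-cluster:10001")

df = daft.from_pydict({"x": [1, 2, 3]})
df.show()  # the same dataframe code now executes on the Ray cluster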

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
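For example, extras can be combined in a single install (assuming the aws and ray extras names used by recent getdaft releases):

pip install -U "getdaft[aws, ray]"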

Quickstart

Check out our 10-minute quickstart!

In this example, we load a dataframe of image URLs from an AWS S3 bucket, download and decode each image, and resize it:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

(Image: dataframe code to load a folder of images from AWS S3 and create thumbnails)

Benchmarks

(Chart: benchmarks for SF100 TPC-H)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft's full range of capabilities including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more (a tiny UDF sketch follows this list).

  • User Guide - take a deep dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft
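As a taste of UDFs, here is a hedged sketch using the @daft.udf decorator from the 0.2.x API; the function and column names are made up:

import daft
from daft import DataType, Series

@daft.udf(return_dtype=DataType.int64())
def text_length(texts: Series):
    # UDFs receive a whole column at a time and return one value per row
    return [len(t) for t in texts.to_pylist()]

df = daft.from_pydict({"text": ["hello", "daft!"]})
df = df.with_column("len", text_length(df["text"]))
df.show()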

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
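Because the session ID is generated when Daft is imported (see below), the variable must be set before the import; a minimal sketch:

import os

# Must be set before daft is imported
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft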

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.2.16.tar.gz (1.3 MB) - Source

Built Distributions

  • getdaft-0.2.16-cp37-abi3-win_amd64.whl (16.8 MB) - CPython 3.7+, Windows x86-64
  • getdaft-0.2.16-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (21.4 MB) - CPython 3.7+, manylinux: glibc 2.17+ x86-64
  • getdaft-0.2.16-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (19.9 MB) - CPython 3.7+, manylinux: glibc 2.17+ ARM64
  • getdaft-0.2.16-cp37-abi3-macosx_11_0_arm64.whl (16.3 MB) - CPython 3.7+, macOS 11.0+ ARM64
  • getdaft-0.2.16-cp37-abi3-macosx_10_7_x86_64.whl (17.7 MB) - CPython 3.7+, macOS 10.7+ x86-64

File details

Details for the file getdaft-0.2.16.tar.gz.

File metadata

  • Download URL: getdaft-0.2.16.tar.gz
  • Upload date:
  • Size: 1.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.16.tar.gz:

  • SHA256: 3fc7b2c3373bc374a90ecc566c6f0d830b9ce751d6c930c96b70b2c4c2afa0c4
  • MD5: a7ce29d6355b23e81c5c84753288777a
  • BLAKE2b-256: 23455556283697e565e5f381970e825e9ee7a48ba8850b6b1da6fbc9f47df87e


File details

Details for the file getdaft-0.2.16-cp37-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.16-cp37-abi3-win_amd64.whl
  • Upload date:
  • Size: 16.8 MB
  • Tags: CPython 3.7+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.16-cp37-abi3-win_amd64.whl:

  • SHA256: c3e74f56b211f88e5c335276fe4670a0dfac8dc8b5c684b22fc570b1350cc40d
  • MD5: 3863e93a89cdef68918f893994560b0a
  • BLAKE2b-256: 0a1f58f101c13747f1b8dc99bb1aa3ad067ae716bc9f99b85612a7782710c5d5


File details

Details for the file getdaft-0.2.16-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.


File hashes

Hashes for getdaft-0.2.16-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

  • SHA256: b932510fd24b4f1f021abd67016bbcdacda3315d0e3ee2a8e339d82719adbd51
  • MD5: fe6b70a582c078ae9a809d5fb1226c05
  • BLAKE2b-256: aa1c79df028123eb3f7e686a808d4e403dc114e035c8da2a6fb1b6bdc164ecf9


File details

Details for the file getdaft-0.2.16-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.


File hashes

Hashes for getdaft-0.2.16-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl:

  • SHA256: 1cb1e5a62fcbb4a909532bb64dc7af56e7ac3fef1b8220448fcae1a8af0c6bc4
  • MD5: de555520dea59a53e1e0dcd98a41cb26
  • BLAKE2b-256: 52ad4a6a428c371cd82361f216f91e6342239d2ce7ec47a90b5ec1ea2f080b31


File details

Details for the file getdaft-0.2.16-cp37-abi3-macosx_11_0_arm64.whl.


File hashes

Hashes for getdaft-0.2.16-cp37-abi3-macosx_11_0_arm64.whl:

  • SHA256: 317a8dff8169638cea40efbc01193d51f31c4ab441fc39f01f163f197fc264a2
  • MD5: cf8feb10f9f3c6550eae4f06edf5a7f4
  • BLAKE2b-256: e49fd7e3fefab59fb65ac5d67884f5b0f63016010983cb8758ad1cb64cd93e45


File details

Details for the file getdaft-0.2.16-cp37-abi3-macosx_10_7_x86_64.whl.


File hashes

Hashes for getdaft-0.2.16-cp37-abi3-macosx_10_7_x86_64.whl:

  • SHA256: 0a355301e79e00ab639150b84d380465f5f69ef9e6f36f1b5cf376e3d24229f6
  • MD5: 0b810dd45db27c61da0e729a7c61be75
  • BLAKE2b-256: a530f95e0d52844ef63b9bf8baf4b2f887d14b39bc8e2c4ebba78eeef679d241

