
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a tabular dataframe structure for easy querying.


Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
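The extras can be pulled in directly with pip. The extra names below are assumptions based on the dependencies mentioned above; the Installation Guide is authoritative:

```shell
# Basic installation
pip install getdaft

# With optional extras (names assumed; see the Installation Guide)
pip install "getdaft[ray]"   # distributed execution on Ray
pip install "getdaft[aws]"   # AWS utilities such as S3 access
```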

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmark results for TPC-H at scale factor 100 (SF100)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
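For example, in a shell (`my_script.py` is a placeholder for your own script):

```shell
# Opt out of telemetry for the current shell session
export DAFT_ANALYTICS_ENABLED=0

# Or for a single invocation
DAFT_ANALYTICS_ENABLED=0 python my_script.py
```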

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.2.26.tar.gz (2.6 MB): Source

Built Distributions

  • getdaft-0.2.26-cp38-abi3-win_amd64.whl (18.8 MB): CPython 3.8+, Windows x86-64
  • getdaft-0.2.26-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (21.2 MB): CPython 3.8+, manylinux glibc 2.17+, x86-64
  • getdaft-0.2.26-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (20.3 MB): CPython 3.8+, manylinux glibc 2.17+, ARM64
  • getdaft-0.2.26-cp38-abi3-macosx_11_0_arm64.whl (17.1 MB): CPython 3.8+, macOS 11.0+, ARM64
  • getdaft-0.2.26-cp38-abi3-macosx_10_12_x86_64.whl (18.6 MB): CPython 3.8+, macOS 10.12+, x86-64

File details

Details for the file getdaft-0.2.26.tar.gz.

File metadata

  • Download URL: getdaft-0.2.26.tar.gz
  • Upload date:
  • Size: 2.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.26.tar.gz

  • SHA256: 95f0223fa31aca1aabdea09a97f09e6a530fca0e2154e028a26986b4791daf45
  • MD5: bacee8518573179f70746159e13952e8
  • BLAKE2b-256: 881d50f0143b711f6726f4081c9645f44d59534f8e66414efe8bd513b58ad173

See more details on using hashes here.
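The published SHA256 digests can be used with pip's hash-checking mode. A sketch of a requirements.txt entry pinning the source distribution by the hash shown above; note that hash-checking mode requires hashes for every requirement, including transitive dependencies:

```
# requirements.txt: pin the release so pip verifies the download
# Install with: pip install --require-hashes -r requirements.txt
getdaft==0.2.26 \
    --hash=sha256:95f0223fa31aca1aabdea09a97f09e6a530fca0e2154e028a26986b4791daf45
```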

File details

Details for the file getdaft-0.2.26-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.26-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 18.8 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.26-cp38-abi3-win_amd64.whl

  • SHA256: 9708b66f7735950fa9ee04dd67864ff60855dfb723c6478eb78a50410ac52b77
  • MD5: 11c3f471ef53dc98e09d4ffc624459c2
  • BLAKE2b-256: ee6319302cbf381ffafc30856eb046ac4c3414b9a386e6f090c4e6532fcd6e0c

See more details on using hashes here.

File details

Details for the file getdaft-0.2.26-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.


File hashes

Hashes for getdaft-0.2.26-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

  • SHA256: fe32ef378468925620fe8b9c1a32dade803ebf6bc054afb4c1de11c70344ec4a
  • MD5: bd2fb9ac57d54d190348a079cb955f3c
  • BLAKE2b-256: 74c7974da59daba067858f099112eedd4e0bedcc347b789192aa568af470ed8e

See more details on using hashes here.

File details

Details for the file getdaft-0.2.26-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.


File hashes

Hashes for getdaft-0.2.26-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl

  • SHA256: bdbb9346aa1c814ba9894d972d2d0d3b2d78a1cee09724ef12f060ccf4d5307a
  • MD5: 7448e8045e91badf5d7dad17750d1ab4
  • BLAKE2b-256: 9c70a3038d71794d9a75f4fb2ad558aece626fbe786fd2d733cd83b5d639b2d6

See more details on using hashes here.

File details

Details for the file getdaft-0.2.26-cp38-abi3-macosx_11_0_arm64.whl.


File hashes

Hashes for getdaft-0.2.26-cp38-abi3-macosx_11_0_arm64.whl

  • SHA256: 0f547fd4a605c72c22879654adf1469bef45af38a365f810f9437486eaf86703
  • MD5: caefe3e9eed601c7d6097e9d0828b86a
  • BLAKE2b-256: 7aed4fb700b411ea016a97b2ab5b9098b37380aa90f6c05e89c513b53f0947fe

See more details on using hashes here.

File details

Details for the file getdaft-0.2.26-cp38-abi3-macosx_10_12_x86_64.whl.


File hashes

Hashes for getdaft-0.2.26-cp38-abi3-macosx_10_12_x86_64.whl

  • SHA256: 2279c29ec3d27c0e1dd6f06216ce4c0e2b2efd639af3a0c09ae48416f66ca89a
  • MD5: 72bef4e17c17bd74f22350f1e333147f
  • BLAKE2b-256: 0429a89fa0dd12a7b20d26811addf6a59d3a8b9c3d28af650bbe54207117d3dd

See more details on using hashes here.
