
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV files, Parquet files, and audio files, into a table dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
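For reference, installs with extras typically look like the following (the exact extra names are assumptions; check the Installation Guide for the supported set):

```shell
# Basic install
pip install getdaft

# With optional extras (names assumed, verify against the Installation Guide):
pip install "getdaft[ray]"   # distributed execution on Ray
pip install "getdaft[aws]"   # AWS utilities such as S3 credential handling
pip install "getdaft[all]"   # everything
```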

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from an AWS S3 bucket’s URLs and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image into 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmarks for SF100 TPC-H

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
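If you prefer to set the variable from Python rather than the shell, a minimal sketch (the assignment must happen before `import daft` runs in the process):

```python
import os

# Opt out of Daft's anonymous analytics for this process.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

# import daft  # importing after this point will not emit analytics events
```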

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.2.21.tar.gz (1.5 MB)

Uploaded Source

Built Distributions

getdaft-0.2.21-cp37-abi3-win_amd64.whl (18.1 MB)

Uploaded CPython 3.7+ Windows x86-64

getdaft-0.2.21-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (22.7 MB)

Uploaded CPython 3.7+ manylinux: glibc 2.17+ x86-64

getdaft-0.2.21-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (21.2 MB)

Uploaded CPython 3.7+ manylinux: glibc 2.17+ ARM64

getdaft-0.2.21-cp37-abi3-macosx_11_0_arm64.whl (17.6 MB)

Uploaded CPython 3.7+ macOS 11.0+ ARM64

getdaft-0.2.21-cp37-abi3-macosx_10_7_x86_64.whl (19.0 MB)

Uploaded CPython 3.7+ macOS 10.7+ x86-64

File details

Details for the file getdaft-0.2.21.tar.gz.

File metadata

  • Download URL: getdaft-0.2.21.tar.gz
  • Upload date:
  • Size: 1.5 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.21.tar.gz
Algorithm Hash digest
SHA256 9b35394619817557eddcfb5402d93478b3a1ef4bf245bb2735106d4d304de98b
MD5 8d2e347b6ecdc48a3efd867cb1536ecc
BLAKE2b-256 8298ada9aa933dfcfb683f6be7909099997446bac4ac76b53c3b821c0c40751c

See more details on using hashes here.

File details

Details for the file getdaft-0.2.21-cp37-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.21-cp37-abi3-win_amd64.whl
  • Upload date:
  • Size: 18.1 MB
  • Tags: CPython 3.7+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.18

File hashes

Hashes for getdaft-0.2.21-cp37-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 6e27f8f2e49ccbc9e24a43737965bc5388272fe27d9c947ee925ec928108fa88
MD5 9e66e064b4019a37ac91c018afb65ac1
BLAKE2b-256 9b969cb07606f0ed2cdbe2198efb99c1a47f8874b4f3bc0239898ebba41edc09


File details

Details for the file getdaft-0.2.21-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.21-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 38bbed6996e366862601b1e2e3651ec572ccc5e23fce1bc4a1d647daef59e14d
MD5 149fe5f919b79c8a61782428c2b2d7aa
BLAKE2b-256 3df866ebb2a363eeb57fd706a3f218e0c3b59e38e31a18997426ab31821d956c


File details

Details for the file getdaft-0.2.21-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.21-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 31954f6ec98e888280c6d966c2c0a427ddfb25d6dc528612f593cc634ca19806
MD5 c606d5161bfd2d33be48ad0894c44dee
BLAKE2b-256 31f09fe658c22847073dffb04d265d215abebb384f6e70626257c96d626c9622


File details

Details for the file getdaft-0.2.21-cp37-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.21-cp37-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 a1b928c9dfb499e784fcc3563403b4d586cd5075c81aa39d73872fb99a8f88a1
MD5 e97a8441934e3c8c3b6c009a0d2fe092
BLAKE2b-256 59598bc3427702afc132217dd8e292b090095ec77d9d0970e08b699a6091e64f


File details

Details for the file getdaft-0.2.21-cp37-abi3-macosx_10_7_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.21-cp37-abi3-macosx_10_7_x86_64.whl
Algorithm Hash digest
SHA256 86ca6c4b57de1c66926bfc7346b46eff87eea3417acfccd6487f39c40930cf29
MD5 09f2fce89193d7521c77492cdffaf253
BLAKE2b-256 9b42ddda600b67c06290d6b0045ada8790ce91af197d1e6869357b0ce19c7112

