
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a tabular dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for an interactive developer experience in notebooks or REPLs: intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your laptop's computational resources. Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source, or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)


Benchmarks

Benchmarks for TPC-H at scale factor 100 (SF100)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft's full range of capabilities, including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
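Because the session ID is generated when Daft is imported, the variable must be set before the first import. A minimal Python equivalent of the shell setting, assuming you prefer to do it in code (the commented-out import is illustrative):

```python
import os

# Opt out of Daft analytics. This must happen before daft is first imported,
# since the session ID is generated at import time.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

# import daft  # imported afterwards, so telemetry stays disabled
```

In a shell, the equivalent is `export DAFT_ANALYTICS_ENABLED=0` before launching Python.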

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.2.27.tar.gz (2.6 MB): Source

Built Distributions

  • getdaft-0.2.27-cp38-abi3-win_amd64.whl (18.9 MB): CPython 3.8+, Windows x86-64
  • getdaft-0.2.27-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (21.3 MB): CPython 3.8+, manylinux: glibc 2.17+, x86-64
  • getdaft-0.2.27-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (20.4 MB): CPython 3.8+, manylinux: glibc 2.17+, ARM64
  • getdaft-0.2.27-cp38-abi3-macosx_11_0_arm64.whl (17.2 MB): CPython 3.8+, macOS 11.0+ ARM64
  • getdaft-0.2.27-cp38-abi3-macosx_10_12_x86_64.whl (18.6 MB): CPython 3.8+, macOS 10.12+ x86-64

File details

Details for the file getdaft-0.2.27.tar.gz.

File metadata

  • Download URL: getdaft-0.2.27.tar.gz
  • Upload date:
  • Size: 2.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.27.tar.gz

  • SHA256: fcb62ddc260c7a8ac8cfaada87d5dd38b46886b02d9b8fe57a27d2aa176325d3
  • MD5: 03e21687a2065cb0b2a7825116346bc1
  • BLAKE2b-256: 185209a1324009d63f4fdf10c6dfc61e3b37d3691f738f1e324a0836a72bb002

See more details on using hashes here.
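As a sketch of how a published digest can be checked locally with only the standard library (the helper name is our own; the commented file name and expected value mirror the entry above):

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the hex SHA256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Example usage: compare against the digest published on this page
# expected = "fcb62ddc260c7a8ac8cfaada87d5dd38b46886b02d9b8fe57a27d2aa176325d3"
# assert sha256_of_file("getdaft-0.2.27.tar.gz") == expected
```

Reading in chunks keeps memory use constant even for multi-megabyte wheels.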

File details

Details for the file getdaft-0.2.27-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.27-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 18.9 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.27-cp38-abi3-win_amd64.whl

  • SHA256: 9eba98926f7fac3e15d63a82a2b510afae454e6e6e509e2026aeebe3a3f74b3d
  • MD5: 7d01220a6636641d9c82b6a1fb468d75
  • BLAKE2b-256: 15050db4d1a1f14ddf3f49c657f26483b6ca74e031c314020cb49b21084907f2

File details

Details for the file getdaft-0.2.27-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File hashes

Hashes for getdaft-0.2.27-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

  • SHA256: de90e30ebd727423afe32cd2333a7bfa4fceff6a5cc69e3af3839af37f0afdd7
  • MD5: 56141ee223c259d89d5f31f728723f16
  • BLAKE2b-256: 5a0952e176a7e8919a30b37646cffb3ebd21c266a49fb7c866100dc4f84f797d

File details

Details for the file getdaft-0.2.27-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File hashes

Hashes for getdaft-0.2.27-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl

  • SHA256: f71bd99964105dc8464fe568c53464f6a44db116bc743cdbc7a5cc83fb126318
  • MD5: 3b65b2b68ea87611158478cd8b2cc64e
  • BLAKE2b-256: 17877c4c397ff42591b7206c2767fc6f739dd8adfc8950dde3a900cab9dd6466

File details

Details for the file getdaft-0.2.27-cp38-abi3-macosx_11_0_arm64.whl.

File hashes

Hashes for getdaft-0.2.27-cp38-abi3-macosx_11_0_arm64.whl

  • SHA256: 2d31c0ecb211e8801c158702c53659be13db82c1656aac67cdaa4f8dad6e29e9
  • MD5: c216af1fd1081298ae09404e909701d1
  • BLAKE2b-256: 99d635e47bb78002d2693bd90ef2ea1dfae38644cce443437cf3dedd019cfc48

File details

Details for the file getdaft-0.2.27-cp38-abi3-macosx_10_12_x86_64.whl.

File hashes

Hashes for getdaft-0.2.27-cp38-abi3-macosx_10_12_x86_64.whl

  • SHA256: 13f75cd4fa5037760757743fbd04fdcdf5c8294dd7975cc369081f9a2c53e49a
  • MD5: cd7c5e54d02c083d3f13bf2a84c72e2d
  • BLAKE2b-256: 1ae3a6aa898ce6db5023ea61021c444a0dca17e8b75c5e0fd0b24e1130ca4e4b
