
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, Protobufs, CSV, Parquet, and audio files, into a tabular dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support


Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
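As a sketch of what such an install looks like (the exact extras names here are an assumption based on common pip packaging conventions; confirm them in the Installation Guide):

```shell
# Basic install
pip install getdaft

# Install with extra dependencies via pip extras (extras names assumed;
# check the Installation Guide for the exact supported extras)
pip install "getdaft[aws,ray]"
```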

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image into 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

(Chart: SF100 TPC-H benchmark results)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities, including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
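For example, the variable can be set from within Python before Daft is imported (a minimal sketch; only the environment variable name comes from this page):

```python
import os

# Opt out of Daft telemetry. This must run before `import daft`,
# since the session ID and analytics hook are set up at import time.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

# ...now it is safe to `import daft` with analytics disabled.
print(os.environ["DAFT_ANALYTICS_ENABLED"])
```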

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.


License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.2.31.tar.gz (3.3 MB)

Uploaded Source

Built Distributions

getdaft-0.2.31-cp38-abi3-win_amd64.whl (20.6 MB)

Uploaded CPython 3.8+ Windows x86-64

getdaft-0.2.31-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (23.1 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ x86-64

getdaft-0.2.31-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (22.2 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ ARM64

getdaft-0.2.31-cp38-abi3-macosx_11_0_arm64.whl (18.7 MB)

Uploaded CPython 3.8+ macOS 11.0+ ARM64

getdaft-0.2.31-cp38-abi3-macosx_10_12_x86_64.whl (20.2 MB)

Uploaded CPython 3.8+ macOS 10.12+ x86-64

File details

Details for the file getdaft-0.2.31.tar.gz.

File metadata

  • Download URL: getdaft-0.2.31.tar.gz
  • Upload date:
  • Size: 3.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.31.tar.gz
Algorithm Hash digest
SHA256 d1402efd6961bc81b33c9d70f459b2f4997769f6f9baf2a0c8abe4b732fb45cc
MD5 09390acad2b558ae271808b3e7c7626b
BLAKE2b-256 16eeec8cd1d7f4d41e88e44123fe528e214ffb0b0ed13ac6d2dfd01e6722317b

See more details on using hashes here.
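As an illustration of how a published digest can be checked (a generic sketch using only Python's standard library, not a Daft or PyPI API), a downloaded file's SHA256 can be computed and compared against the value in the table above:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA256 digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage against the published digest for the sdist (filename assumed to be
# in the current directory):
# expected = "d1402efd6961bc81b33c9d70f459b2f4997769f6f9baf2a0c8abe4b732fb45cc"
# assert sha256_of("getdaft-0.2.31.tar.gz") == expected
```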

File details

Details for the file getdaft-0.2.31-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.31-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 20.6 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.31-cp38-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 ff14721eef785ffccd34b56ac06b1a9f4f93defcb9d6e87c9215b06c3170c633
MD5 6874213afbb87c021877511a6c0b60ba
BLAKE2b-256 9a54984d145963f516752dd0660bf56b7534a8c0649d98cba38e70e58d279db5

See more details on using hashes here.

File details

Details for the file getdaft-0.2.31-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.31-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 e1868def8854e954ef3d4162ade002081acd0105780cdea37112852134beb495
MD5 b4f2027b54f2bd1a9e549f9a14569d6a
BLAKE2b-256 aef2cec5f17b3c85f2cba7cae176d81f0fb006488576f1b28f622352b963c049

See more details on using hashes here.

File details

Details for the file getdaft-0.2.31-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.31-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 b78a966b7cccf539b381b4fa416c3d1d2099d752f07f14a2d22366b574b804bf
MD5 58b985daa41edcb22df3c513002079c4
BLAKE2b-256 114a5f357db9a2d2c24b23944e1a1b57dc5970bd8b54c690a33ced39cbc49a9a

See more details on using hashes here.

File details

Details for the file getdaft-0.2.31-cp38-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.31-cp38-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 580b01d16d6491d841fdf14ff8d10aa8e9ec973f3bf636721d24ef18bfde590e
MD5 ea0e003265851f0bf0605c72a61127c8
BLAKE2b-256 d890d3fbf2c13a6b8bee31546de3ee6ffa2d7b457b6f59734ca3d362ae5f8375

See more details on using hashes here.

File details

Details for the file getdaft-0.2.31-cp38-abi3-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.2.31-cp38-abi3-macosx_10_12_x86_64.whl
Algorithm Hash digest
SHA256 a6c07b4d41649bde6b192c10fe6efeb88cbe74e56cb604174d5b29796750ee9f
MD5 2d428a20ed50c53e5b66023e78eab7b2
BLAKE2b-256 bca3aa51aab07dc847810484d44e3a44c4b6b0afb29f55b83e939999698101b1

See more details on using hashes here.
