
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration (see the sketch after this list)

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage
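To make the first two bullets concrete, here is a minimal sketch of the lazy API and plan inspection, using made-up in-memory data:

import daft

# Build a lazy dataframe; nothing executes yet
df = daft.from_pydict({"a": [1, 2, 3], "b": ["x", "y", "z"]})
df = df.where(df["a"] > 1).select(df["b"])

# Print the plan produced by the query optimizer
df.explain(show_all=True)

# Execution happens only on collect() or show()
df.collect()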


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs (see the sketch after this list).
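As a minimal sketch of the distributed-computing principle, switching from the default local runner to a Ray cluster is a one-line change; the cluster address and bucket path below are hypothetical:

import daft

# Hypothetical Ray cluster address; omit the argument to start Ray locally
daft.context.set_runner_ray(address="ray://my-cluster:10001")

# Hypothetical bucket; the query now runs on the Ray cluster
df = daft.read_parquet("s3://my-bucket/data/*.parquet")
df.show(8)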

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
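For example, assuming the optional extras follow the usual pip convention (the extra names here are our reading of the release metadata and may change between releases), pip install "getdaft[ray]" adds the Ray runner and pip install "getdaft[aws]" adds the AWS utilities.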

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs pointing to an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails
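Note that Daft evaluates lazily: the with_column calls above only build a query plan, and no images are downloaded or decoded until df.show(3) (or df.collect()) materializes results.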

Benchmarks

Benchmarks for SF100 TPC-H

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including data loading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more (a minimal groupby sketch follows this list).

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft
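As a taste of the groupby and aggregation capabilities mentioned in the tour above, here is a minimal sketch with made-up data; the expression-based agg API shown reflects recent Daft releases and may differ in older versions:

import daft
from daft import col

df = daft.from_pydict({"category": ["a", "a", "b"], "value": [1, 2, 3]})

# Sum values within each category
df = df.groupby("category").agg(col("value").sum())
df.show()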

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
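Since the session ID is generated when Daft is imported, the safe ordering is to set the variable before the import; a minimal sketch:

import os

# Disable Daft telemetry for this process; set before importing daft
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft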

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.


License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.2.30.tar.gz (3.3 MB)

Uploaded Source

Built Distributions

getdaft-0.2.30-cp38-abi3-win_amd64.whl (20.6 MB)

Uploaded CPython 3.8+ Windows x86-64

getdaft-0.2.30-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (23.1 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ x86-64

getdaft-0.2.30-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (22.1 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ ARM64

getdaft-0.2.30-cp38-abi3-macosx_11_0_arm64.whl (18.7 MB)

Uploaded CPython 3.8+ macOS 11.0+ ARM64

getdaft-0.2.30-cp38-abi3-macosx_10_12_x86_64.whl (20.2 MB)

Uploaded CPython 3.8+ macOS 10.12+ x86-64

File details

Details for the file getdaft-0.2.30.tar.gz.

File metadata

  • Download URL: getdaft-0.2.30.tar.gz
  • Upload date:
  • Size: 3.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.30.tar.gz
Algorithm Hash digest
SHA256 b75f4fab392940dbe5eafb1a3c48e4993558b9b5796cc0fdbccf8a7d4510a874
MD5 c0d3dd44060eaa43962d548517def0df
BLAKE2b-256 fef89e80ec3464690fa84a79a94570a2c40e5c5c3ee69060af89cb5f54893d8e

See more details on using hashes here.
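As a minimal sketch, a downloaded artifact can be checked against the published SHA256 digest using Python's standard hashlib:

import hashlib

# Published SHA256 digest for getdaft-0.2.30.tar.gz (from the table above)
expected = "b75f4fab392940dbe5eafb1a3c48e4993558b9b5796cc0fdbccf8a7d4510a874"

with open("getdaft-0.2.30.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == expected, "hash mismatch: file may be corrupt or tampered with"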

File details

Details for the file getdaft-0.2.30-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.30-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 20.6 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.30-cp38-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 1d9d9e01fad03ce34e44b7299a8286b2322470acdb4cb0523ae63649c124eb3d
MD5 316b7a6ae93e0dfcfd6ce1905bd5128a
BLAKE2b-256 58b2800666e371fda45b73347855eb9d3d9ff6579fa3c520d8caad36b183d123


File details

Details for the file getdaft-0.2.30-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.


File hashes

Hashes for getdaft-0.2.30-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 8bd297ccf2bb072fccd7093361b976f2d3b93fa369222cd207603c9cbd24eefd
MD5 7fd9164f2f130182d735c32316334209
BLAKE2b-256 9b3c6c572476c42dc8337bbbe6a0e9771021f9d1c3530e0a09ac9c34abd9c737


File details

Details for the file getdaft-0.2.30-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.


File hashes

Hashes for getdaft-0.2.30-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 b1395c244557d4039a95027a127f5d3826f49d56dcd850be892f0e2c179e4619
MD5 0320af921b413907d3b5545104e1c374
BLAKE2b-256 99ca2f34505947ebddf82282829e32bd037c61faf2c72e717f2eb1f8a376597f


File details

Details for the file getdaft-0.2.30-cp38-abi3-macosx_11_0_arm64.whl.


File hashes

Hashes for getdaft-0.2.30-cp38-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 e9c00a11922bb200a9f37bb77123f0e229cea489cc78a63ab611a25dead8eed7
MD5 c6a1622bba835873c39c98a8c41f92e7
BLAKE2b-256 490160072bb17a02a21335fdbe07b17f3ce9cc90bfdea54baffc95a81eb0a79f


File details

Details for the file getdaft-0.2.30-cp38-abi3-macosx_10_12_x86_64.whl.


File hashes

Hashes for getdaft-0.2.30-cp38-abi3-macosx_10_12_x86_64.whl
Algorithm Hash digest
SHA256 82da65b6e71a2de4dae7b363222424d045aad82a19055663a38c74bf7d262cab
MD5 b3ebcc16a6b85c6311a7480291651dd7
BLAKE2b-256 9e8137a620b23d9393c3a1134dd1c0405a586976547bc5a935f241f0e58cd745

