
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any kind of data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a tabular dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.
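As a minimal sketch of the distributed-computing principle above: Daft's runner can be selected with the DAFT_RUNNER environment variable before import. The cluster address below is a hypothetical placeholder, and this assumes the Ray extras are installed.

```python
import os

# Select the Ray runner before daft is imported; daft reads this
# configuration at import time.
os.environ["DAFT_RUNNER"] = "ray"

# Hypothetical cluster address - replace with your Ray head node.
os.environ["RAY_ADDRESS"] = "ray://head-node:10001"

# import daft  # subsequent dataframe operations would now run on the cluster
```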

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmarks for TPC-H at scale factor 100 (SF100)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities, including data loading from URLs, joins, user-defined functions (UDFs), groupbys, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
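For a process-level sketch of the opt-out, the variable can also be set from Python before Daft is imported (since the session ID is generated at import time):

```python
import os

# Disable Daft's anonymous analytics; must be set before daft is imported,
# because the telemetry session is created at import time.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

# import daft  # telemetry is now disabled for this process
```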

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform.

Source Distribution

  • getdaft-0.2.25.tar.gz (2.6 MB) - Source

Built Distributions

  • getdaft-0.2.25-cp38-abi3-win_amd64.whl (18.1 MB) - CPython 3.8+, Windows x86-64

  • getdaft-0.2.25-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (20.4 MB) - CPython 3.8+, manylinux (glibc 2.17+), x86-64

  • getdaft-0.2.25-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (19.5 MB) - CPython 3.8+, manylinux (glibc 2.17+), ARM64

  • getdaft-0.2.25-cp38-abi3-macosx_11_0_arm64.whl (16.4 MB) - CPython 3.8+, macOS 11.0+, ARM64

  • getdaft-0.2.25-cp38-abi3-macosx_10_12_x86_64.whl (17.8 MB) - CPython 3.8+, macOS 10.12+, x86-64

File details

Details for the file getdaft-0.2.25.tar.gz.

File metadata

  • Download URL: getdaft-0.2.25.tar.gz
  • Size: 2.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.25.tar.gz

  • SHA256: 60b2ca7d39447ba4b19eab6ccfd6fc706914ecf43d0080a13c832b013dda589b
  • MD5: e22d472085de05167a9d0d2f6a31b5bd
  • BLAKE2b-256: 4b6cb2b93ce8c1c5a2c923fa907e9108ff9cd2364ae813b80aa51e29dcbc9040

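A published digest can be checked locally before installing. Here is a minimal sketch using Python's standard hashlib; the filename assumes the source distribution above was downloaded to the working directory.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in 64 KiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Digest published for getdaft-0.2.25.tar.gz (from the table above)
EXPECTED = "60b2ca7d39447ba4b19eab6ccfd6fc706914ecf43d0080a13c832b013dda589b"

# Uncomment after downloading the file:
# assert sha256_of("getdaft-0.2.25.tar.gz") == EXPECTED
```

pip can perform the same check automatically in hash-checking mode (pip install --require-hashes with pinned requirements).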

File details

Details for the file getdaft-0.2.25-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.2.25-cp38-abi3-win_amd64.whl
  • Size: 18.1 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.9

File hashes

Hashes for getdaft-0.2.25-cp38-abi3-win_amd64.whl

  • SHA256: fbb3437e666478d06e661d961e5fd10b8cc33385bd2bafafcd22daf403fe6df1
  • MD5: 0b06e1f4496c7e896240cfda11eb46ca
  • BLAKE2b-256: 89367cf9d48ed32587b17c284d66b4ba077987ac3393733f6d059692354b4b2b


File details

Details for the file getdaft-0.2.25-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.


File hashes

Hashes for getdaft-0.2.25-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

  • SHA256: 1b86a42e7310de613a0fb30d68a70ee0678e6605023e48a3c1dd28f8752d380e
  • MD5: b57edb083029fff452a58f38b1054e19
  • BLAKE2b-256: 35eb0340a3ef19dd6f04ddddc21080a48513d151aae3cc2cbd3887f1cad16513


File details

Details for the file getdaft-0.2.25-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.


File hashes

Hashes for getdaft-0.2.25-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl

  • SHA256: 2cfeef90e2f446f65e0e7292431e5354995fe693cf9bbbd434dafd4b8971ea83
  • MD5: ec5665a60a76181551ebb49d7783982c
  • BLAKE2b-256: 0f1d7efa6ccccff053915e4042d54b6b68f2d6013645008757d319fd437113ad


File details

Details for the file getdaft-0.2.25-cp38-abi3-macosx_11_0_arm64.whl.


File hashes

Hashes for getdaft-0.2.25-cp38-abi3-macosx_11_0_arm64.whl

  • SHA256: 12a95f0ce9206c77a439ace0dc705d13acbe0e8278907ad2e57f62e0c01330ad
  • MD5: afea464f3a5e878ed1063e529795d930
  • BLAKE2b-256: 6b371e603d1c6384dad386d82ab6d13976376310c14774b3a4d5fed99f6e6828


File details

Details for the file getdaft-0.2.25-cp38-abi3-macosx_10_12_x86_64.whl.


File hashes

Hashes for getdaft-0.2.25-cp38-abi3-macosx_10_12_x86_64.whl

  • SHA256: 7aab5bdf4af6b9bb0f7e0555cd36762d57da97ed026017f3a4b00f97bf5bf7f1
  • MD5: 92b02390d65f257314de8b3a377f721d
  • BLAKE2b-256: a928b28b5a05f6e3bb81f36d5a5555f378196a1a23ce733fe55c7ee3bb57f128

