
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, Protobuf messages, CSV files, Parquet files, and audio files, into a table-like dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmarks for SF100 TPCH

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.


License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.3.3.tar.gz (3.6 MB)

Uploaded Source

Built Distributions

getdaft-0.3.3-cp38-abi3-win_amd64.whl (27.6 MB)

Uploaded CPython 3.8+ Windows x86-64

getdaft-0.3.3-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (30.2 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ x86-64

getdaft-0.3.3-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (28.8 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ ARM64

getdaft-0.3.3-cp38-abi3-macosx_11_0_arm64.whl (25.1 MB)

Uploaded CPython 3.8+ macOS 11.0+ ARM64

getdaft-0.3.3-cp38-abi3-macosx_10_12_x86_64.whl (27.2 MB)

Uploaded CPython 3.8+ macOS 10.12+ x86-64

File details

Details for the file getdaft-0.3.3.tar.gz.

File metadata

  • Download URL: getdaft-0.3.3.tar.gz
  • Upload date:
  • Size: 3.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.3.tar.gz
  • SHA256: 4d8d22ad7f5514bb90841481e3ae04698ebd07b912af421b1af225690b2ddd05
  • MD5: 6983698e5c6f198397aa39958da27f00
  • BLAKE2b-256: 26a4b0d110cb8fe2c791d6908c4757aafbe5e3ce04c1f895f87b972272990bfc

See more details on using hashes here.
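To check a downloaded file against the published digests, the standard library is enough. A sketch (the expected digest is the SHA256 listed above; the path is wherever the file was saved, so the final comparison is left commented out):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# SHA256 published above for getdaft-0.3.3.tar.gz
expected = "4d8d22ad7f5514bb90841481e3ae04698ebd07b912af421b1af225690b2ddd05"
# assert sha256_of("getdaft-0.3.3.tar.gz") == expected
```

Reading in fixed-size chunks keeps memory use constant regardless of file size.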

File details

Details for the file getdaft-0.3.3-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.3.3-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 27.6 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.3-cp38-abi3-win_amd64.whl
  • SHA256: 03fa7dced5598a5eeb61d7dadb764da9f103cc92c108b3333682b00ed8903800
  • MD5: fb1f249aedfc80dcef10e462fd92ff70
  • BLAKE2b-256: d5e366d5873645f054862e20630399540b1865b1effca388cee9ae19a9f69768

See more details on using hashes here.

File details

Details for the file getdaft-0.3.3-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.3-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  • SHA256: 0bf5bb4e66868ef5035a6e393c43d90077cb61850a7d69ed4afea7c5027214b8
  • MD5: b6f81aa48a8c890e4b9861af1db63166
  • BLAKE2b-256: b02802274dc1ceb59098b13d45afcef99181f33026ea2da83e487b43593c07a7

See more details on using hashes here.

File details

Details for the file getdaft-0.3.3-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.3-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
  • SHA256: e1b506dabb985dc757887405ceb18f3125e948f888f706f7e47c9f7c1a307877
  • MD5: c9bfe51761ef0256395beca7b0ed64f4
  • BLAKE2b-256: 1317992dc95bd189be92a1354521dacc477edeeba23f56d2dcd35beb332f42b0

See more details on using hashes here.

File details

Details for the file getdaft-0.3.3-cp38-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.3-cp38-abi3-macosx_11_0_arm64.whl
  • SHA256: 2ebd7ea44d0c9b9defd5d7070d02e1971ad22241276ef8f711c0d396db303ea9
  • MD5: 7d24f0c3ac6487924d6bc1bf727139f7
  • BLAKE2b-256: 310744de7aef05181b85799eae75f4f5b7cee28eebd9eb22c3913ed1163714bb

See more details on using hashes here.

File details

Details for the file getdaft-0.3.3-cp38-abi3-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.3-cp38-abi3-macosx_10_12_x86_64.whl
  • SHA256: b1217f3b8200b9ebfbc0997b0c8369c19a581048c4cb42058175f620f0b13598
  • MD5: 543ec8f6d607938693830da90ec9a322
  • BLAKE2b-256: 7221b7f1d3c4147751be8fd0c1b8b07e080baa14e6cfc60596e3d3655474af21

See more details on using hashes here.
