
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data such as PDF documents, images, protobufs, csv, parquet and audio files into a table dataframe structure for easy querying


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Unified Engine for Data Analytics, Engineering & ML/AI

Daft is a distributed query engine for large-scale data processing using Python or SQL, implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration, or SQL for analytical queries

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage
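The lazy-evaluation and query-optimization ideas above can be illustrated with a toy sketch (plain Python, not Daft's actual internals): operations are recorded into a plan rather than executed immediately, which is what gives an optimizer the chance to rewrite the query before anything runs.

```python
# Toy illustration of a lazy dataframe (NOT Daft's real implementation):
# operations are recorded as a plan and only executed on .collect().
class LazyFrame:
    def __init__(self, rows, plan=None):
        self.rows = rows
        self.plan = plan or []

    def where(self, pred):
        # Record the filter instead of applying it immediately
        return LazyFrame(self.rows, self.plan + [("filter", pred)])

    def select(self, fn):
        # Record the projection
        return LazyFrame(self.rows, self.plan + [("map", fn)])

    def collect(self):
        # Execute the recorded plan in order; an optimizer could
        # rewrite self.plan here before execution.
        out = self.rows
        for op, f in self.plan:
            out = [f(r) for r in out] if op == "map" else [r for r in out if f(r)]
        return out

df = LazyFrame([1, 2, 3, 4]).where(lambda x: x % 2 == 0).select(lambda x: x * 10)
print(df.collect())  # [20, 40]
```

Nothing is computed when `where` or `select` is called; only `collect` walks the plan, which is why a real engine like Daft can reorder filters and projections for efficiency.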


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source, or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
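As a sketch, optional dependencies can typically be requested through pip's extras syntax; the extras names below (`ray`, `aws`, `all`) are assumptions to be verified against the Installation Guide:

```shell
# Basic installation
pip install getdaft

# With optional extras (extras names assumed; verify in the Installation Guide)
pip install "getdaft[ray]"   # distributed execution on Ray
pip install "getdaft[aws]"   # AWS utilities
pip install "getdaft[all]"   # everything
```

Quoting the requirement specifier avoids shells (e.g. zsh) interpreting the square brackets as a glob.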

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from an AWS S3 bucket’s URLs and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

# Materialize the (lazy) dataframe and display the first 3 rows
df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmarks for TPC-H at scale factor 100 (SF100)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
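For example, in a POSIX shell you could disable telemetry for the current session or for a single run (the script name below is hypothetical):

```shell
# Disable Daft telemetry for the current shell session
export DAFT_ANALYTICS_ENABLED=0

# Or disable it for a single invocation (my_script.py is a placeholder)
DAFT_ANALYTICS_ENABLED=0 python my_script.py
```

Adding the `export` line to your shell profile makes the setting persistent.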

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.


License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.3.10.tar.gz (3.8 MB)

Uploaded Source

Built Distributions

getdaft-0.3.10-cp38-abi3-win_amd64.whl (27.5 MB)

Uploaded CPython 3.8+ Windows x86-64

getdaft-0.3.10-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (30.4 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ x86-64

getdaft-0.3.10-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (28.9 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ ARM64

getdaft-0.3.10-cp38-abi3-macosx_11_0_arm64.whl (25.2 MB)

Uploaded CPython 3.8+ macOS 11.0+ ARM64

getdaft-0.3.10-cp38-abi3-macosx_10_12_x86_64.whl (27.4 MB)

Uploaded CPython 3.8+ macOS 10.12+ x86-64

File details

Details for the file getdaft-0.3.10.tar.gz.

File metadata

  • Download URL: getdaft-0.3.10.tar.gz
  • Upload date:
  • Size: 3.8 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.10.tar.gz
Algorithm Hash digest
SHA256 b45d1d3a82cfec0a187f52702a8cec7c63a5b805e8998625c07d74ef041ef39f
MD5 0ad0a67e123171c6eb7cc792ef0595ef
BLAKE2b-256 9bb78b61c877e3444510f0dffb95ca3636eeefde756bdaf489d95d675be22d0f

See the PyPI documentation for more details on using hashes.
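To check a downloaded file against a published SHA256 digest, a small stdlib-only sketch (the filename in the comment is this release's sdist; any local path works):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA256 hex digest of a file, reading in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest published above, e.g.:
# sha256_of("getdaft-0.3.10.tar.gz") == "b45d1d3a82cfec0a187f52702a8cec7c63a5b805e8998625c07d74ef041ef39f"
```

Reading in chunks keeps memory use constant even for multi-gigabyte downloads.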

File details

Details for the file getdaft-0.3.10-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.3.10-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 27.5 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.10-cp38-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 2e541009d0fe9e98ef2aef7d5cbe5d05c463b54b7239d32807639f2a5434b502
MD5 6e07d94ca4f65afaa58df4ebbdaef91b
BLAKE2b-256 c5ffb253d9bd8f3f1b74e82fe5b70a4957cd1bc87603d5ec9306820f4b743ff7


File details

Details for the file getdaft-0.3.10-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.


File hashes

Hashes for getdaft-0.3.10-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 0eaf2b7b116cf9c9c5bc466fe5c3d5d3a689b96fe51cc0ae1b1de224b1e59e6b
MD5 060547f8ed61566f2a20c3e7d7e63ac1
BLAKE2b-256 83011d08013e7372fa85923516ebadfecce8ec2f6eddf99f8460fee0e9c2614d


File details

Details for the file getdaft-0.3.10-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.


File hashes

Hashes for getdaft-0.3.10-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 8833187719e8963c3b53a78af648c85c2b1a39e383a8bddadbc98d5f4072b502
MD5 a5873be74c3499cc27c4f2b1d589fd14
BLAKE2b-256 21f83cf4408f5ffb386faae723d62a7485ee78fd8366d3f302e501672ef61be0


File details

Details for the file getdaft-0.3.10-cp38-abi3-macosx_11_0_arm64.whl.


File hashes

Hashes for getdaft-0.3.10-cp38-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 1175b7c12e033332b7dfe5a329fa95c414656e7c3478eca9a5b511f9e562d93f
MD5 c9de6c4f7a5ea14bf240b7729ef1235c
BLAKE2b-256 981eb6c881ea00f51c06540abf2b21f40a0aa3e0ec2912060f33e01a6b4e1d34


File details

Details for the file getdaft-0.3.10-cp38-abi3-macosx_10_12_x86_64.whl.


File hashes

Hashes for getdaft-0.3.10-cp38-abi3-macosx_10_12_x86_64.whl
Algorithm Hash digest
SHA256 06ce182bcf4b1f15b623152f129c9b8b120007f8f7ed0b35d194b608baa8a8f1
MD5 622e00b0774ed4a00d8e975039c33e4b
BLAKE2b-256 9a7be9f15057b5a03c851a6de2d14ed57cf539164de013af4dda5429860aa663

