
Distributed Dataframes for Multimodal Data

Reason this release was yanked:

daft.context.set_runner_ray is broken, which leads to a bad experience when using Daft on Ray.

Project description

Daft dataframes can load any data, such as PDF documents, images, Protobuf messages, CSV files, Parquet files, and audio files, into a tabular dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Unified Engine for Data Analytics, Engineering & ML/AI

Daft is a distributed query engine for large-scale data processing using Python or SQL, implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration, or SQL for analytical queries

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching/query optimizations accelerates your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.
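As a sketch of the Ray integration described above, Daft can be pointed at a Ray cluster before building a dataframe (the cluster address below is a placeholder; a real address depends on your deployment):

```python
import daft

# Tell Daft to execute queries on a Ray cluster instead of locally.
# The address is a placeholder; omit it to connect to a Ray instance
# already initialized in the current process.
daft.context.set_runner_ray(address="ray://mycluster:10001")

# From here on, dataframe operations are distributed across the cluster.
df = daft.from_pydict({"x": [1, 2, 3]})
df.show()
```

The runner must be set before any dataframe work begins, since it determines where query execution happens.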

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source, or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
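For example (the extras names here follow the Installation Guide and may vary between versions):

```shell
# Basic installation
pip install getdaft

# With optional Ray and AWS dependencies
pip install "getdaft[ray,aws]"
```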

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image into 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmarks for TPC-H at scale factor 100 (SF100)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
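If you prefer to set this from Python rather than the shell, the variable must be set before daft is imported, since the session is keyed at import time; a minimal sketch:

```python
import os

# Must run before `import daft` for the setting to take effect.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"
```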

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.


License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.3.12.tar.gz (3.8 MB, Source)

Built Distributions

  • getdaft-0.3.12-cp38-abi3-win_amd64.whl (28.1 MB, CPython 3.8+, Windows x86-64)

  • getdaft-0.3.12-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (31.0 MB, CPython 3.8+, manylinux: glibc 2.17+ x86-64)

  • getdaft-0.3.12-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (29.4 MB, CPython 3.8+, manylinux: glibc 2.17+ ARM64)

  • getdaft-0.3.12-cp38-abi3-macosx_11_0_arm64.whl (25.8 MB, CPython 3.8+, macOS 11.0+ ARM64)

  • getdaft-0.3.12-cp38-abi3-macosx_10_12_x86_64.whl (27.9 MB, CPython 3.8+, macOS 10.12+ x86-64)

File details

Details for the file getdaft-0.3.12.tar.gz.

File metadata

  • Download URL: getdaft-0.3.12.tar.gz
  • Size: 3.8 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.12.tar.gz
Algorithm Hash digest
SHA256 af78370e4b289f3b8696c29a07ea8ad0b71e93f10cb38c3ebbc012b02156ec1d
MD5 858ad133cc940f94d15e5174cac637d0
BLAKE2b-256 6b5c17d517a698b6d7b08f5c3cd4a08f5911db9e077a3c935cd45c4e6fa396e2

See more details on using hashes here.

File details

Details for the file getdaft-0.3.12-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.3.12-cp38-abi3-win_amd64.whl
  • Size: 28.1 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.12-cp38-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 d3a734e318af36959dd61a4e346cb2331a5156ec8db159daebcddfd283df25d1
MD5 979d9295c2032ba531924712a090270e
BLAKE2b-256 915a2b73c6e20b9d078ac2f554a9ab3f49700830778b155f2179ae0f2047e3a8


File details

Details for the file getdaft-0.3.12-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.12-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 4206059c69b367e1e7abd36cec1c15c7d418a3179e305a19a3d2e97675f1e50c
MD5 ec534bca5e7fe6ceeeb77790d29996a5
BLAKE2b-256 4f10ac34f2fc564d1a15f41ed12e24edc610cc8075c697f44865a7c35a65f067


File details

Details for the file getdaft-0.3.12-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.12-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 31b08a490cde557d7c92e9505de0c5ad30076006d60b80033ccb83488a530ed6
MD5 56a1c0c9d7256f6e220ebca23bb1cb53
BLAKE2b-256 ba4e5ce6f24b092b78153ed7d7f12a597a4d10c72f74e9a30ff68702e1acc2a9


File details

Details for the file getdaft-0.3.12-cp38-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.12-cp38-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 7a3e1f35aacbd9696312381442b4b2d4f7fc8e1a48918566a9b95c866ddb61b2
MD5 4522448c7e420f393056241511e6c0fe
BLAKE2b-256 60b7931aab36fa1794582012828fd3708e408bce8e87a9c7fff7539b0895082e


File details

Details for the file getdaft-0.3.12-cp38-abi3-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.12-cp38-abi3-macosx_10_12_x86_64.whl
Algorithm Hash digest
SHA256 21cc1caa6f3d05bc36924726a1eb5153b8132152faf20acde1fb4a71a0c90687
MD5 8ae0681b5ade70bf9dec218cf34c4ed4
BLAKE2b-256 94e0be4e46dcbfef5dfbe4d8bf36bde997c6f36d0692da8157c02ccc1afc9715

