
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any kind of data, such as PDF documents, images, Protobuf messages, CSV files, Parquet files, and audio files, into a tabular dataframe structure for easy querying.


Daft: Unified Engine for Data Analytics, Engineering & ML/AI

Daft is a distributed query engine for large-scale data processing using Python or SQL, implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration, or SQL for analytical queries

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching/query optimizations accelerates your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.
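The lazy, optimizable execution model behind these principles can be illustrated with a small pure-Python sketch. This is a toy illustration of deferred execution in general, not Daft's implementation (Daft's planner and optimizer are written in Rust):

```python
class LazyFrame:
    """Toy lazy dataframe: operations only record a plan; collect() runs it."""

    def __init__(self, rows, plan=()):
        self._rows = list(rows)
        self._plan = tuple(plan)

    def map(self, fn):
        # Nothing is computed here; we just append a step to the plan.
        return LazyFrame(self._rows, self._plan + (("map", fn),))

    def filter(self, pred):
        return LazyFrame(self._rows, self._plan + (("filter", pred),))

    def collect(self):
        # Only now does work happen. A real engine would first rewrite
        # the recorded plan into a more efficient equivalent.
        rows = self._rows
        for kind, fn in self._plan:
            if kind == "map":
                rows = [fn(r) for r in rows]
            else:
                rows = [r for r in rows if fn(r)]
        return rows


df = LazyFrame(range(6)).map(lambda x: x * x).filter(lambda x: x > 4)
# No computation has run yet; the plan is just recorded.
result = df.collect()  # [9, 16, 25]
```

Because the whole plan is visible before anything executes, a query optimizer can rewrite it (e.g. pushing filters earlier or fusing steps) without changing the result, which is what makes the "focus on the what" style of API possible.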

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source, or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from an AWS S3 bucket’s URLs and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image into 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmarks for SF100 TPCH

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
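The variable must be set before Daft is imported, since the session ID is generated at import time. A minimal sketch of doing this from Python (setting it in your shell before launching Python works equally well):

```python
import os

# Opt out of Daft's analytics for this process. This must happen
# before `import daft`, because the telemetry session is initialized
# when the module is imported.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

# import daft  # safe to import now; analytics is disabled
```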

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.


License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide on installing packages.
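pip normally selects the correct wheel for your machine automatically, but if you are picking a file by hand, the standard library can report the platform pieces that the wheel tags (e.g. `cp38-abi3-manylinux_2_17_x86_64`) are matched against. A rough guide only:

```python
import platform
import sysconfig

# The pieces of a wheel tag for the current interpreter and machine:
print(sysconfig.get_platform())   # e.g. "linux-x86_64" or "macosx-11.0-arm64"
print(platform.python_version())  # the abi3 wheels below require CPython >= 3.8
print(platform.machine())         # e.g. "x86_64" or "arm64"
```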

Source Distribution

  • getdaft-0.3.9.tar.gz (3.7 MB): Source

Built Distributions

  • getdaft-0.3.9-cp38-abi3-win_amd64.whl (27.0 MB): CPython 3.8+, Windows x86-64
  • getdaft-0.3.9-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (30.0 MB): CPython 3.8+, manylinux (glibc 2.17+), x86-64
  • getdaft-0.3.9-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (28.5 MB): CPython 3.8+, manylinux (glibc 2.17+), ARM64
  • getdaft-0.3.9-cp38-abi3-macosx_11_0_arm64.whl (24.9 MB): CPython 3.8+, macOS 11.0+, ARM64
  • getdaft-0.3.9-cp38-abi3-macosx_10_12_x86_64.whl (27.0 MB): CPython 3.8+, macOS 10.12+, x86-64

File details

Details for the file getdaft-0.3.9.tar.gz.

File metadata

  • Download URL: getdaft-0.3.9.tar.gz
  • Upload date:
  • Size: 3.7 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.9.tar.gz

  • SHA256: 59e14728e5a74ed12e7494ead4f704d7b7e6bfe4c395fe03304c2cc43147b8b1
  • MD5: 1f76589a92dae35e43fa996e388a494b
  • BLAKE2b-256: b52c9b726b5f5161469586bb1985782bb7de4d485ef3acf10dafcf58350967db

See the PyPI documentation for more details on using hashes.
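As an illustration, a downloaded file's SHA256 digest can be checked against the published value using only the standard library. The helper below is a sketch; the filename in the usage comment is the source distribution listed above:

```python
import hashlib


def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA256 hex digest of a file, reading it in chunks
    so that large wheels do not need to fit in memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Usage: compare against the digest published on PyPI, e.g.
# sha256_of("getdaft-0.3.9.tar.gz") == "59e14728e5a74ed12e7494ead4f704d7b7e6bfe4c395fe03304c2cc43147b8b1"
```

Note that pip can do this check automatically when digests are pinned in a requirements file with the `--hash` option.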

File details

Details for the file getdaft-0.3.9-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.3.9-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 27.0 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.9-cp38-abi3-win_amd64.whl

  • SHA256: f98e24695837db18366d216267146641f2aeb2993a30e86c8808470ae23d5e88
  • MD5: 9138a26f91a3fa156f782e4589434fdf
  • BLAKE2b-256: 4822a9194a684e22e3f685c364463c98e4afd288d91c461226918804c1d74e78


File details

Details for the file getdaft-0.3.9-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.9-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

  • SHA256: eeaedceaed6e874a74c2274eef474307a9b7ee31babaa0b6e5ca72dab3e80b55
  • MD5: baadaf4493784567be7036baedb99b1d
  • BLAKE2b-256: ab072c99c3434d218d11d25bf80b9055e5822cd6faf919845ae3bcc42a1d7f20


File details

Details for the file getdaft-0.3.9-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.9-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl

  • SHA256: 904c3a0d07ee0ea5b93e586b16d33520fb9ea761f001d03959edac05fa54b218
  • MD5: 03832f682f5b7017b5778fe044bcb3b8
  • BLAKE2b-256: 04bc1c485ffe6ec7c9095057abcd9b90231965474b161e0128b867ed1cf0a8d2


File details

Details for the file getdaft-0.3.9-cp38-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.9-cp38-abi3-macosx_11_0_arm64.whl

  • SHA256: e1d6e9288ee3dbb5335484c08904cbd8a794e764005216856f29505f82a38162
  • MD5: fbe17ddb259af95214694ecd02f23e2e
  • BLAKE2b-256: d95e919225390b28764403aa853871f32f1039e0ca95aabb3f4270b4023eed9c


File details

Details for the file getdaft-0.3.9-cp38-abi3-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.9-cp38-abi3-macosx_10_12_x86_64.whl

  • SHA256: f5b430bf26b1e7cf5d26ebb41b5313ff3c632f8ec3922311605689f1c34bb58b
  • MD5: f9f6964317a325828117ccf72ca8519f
  • BLAKE2b-256: e7c9dd0dfc4a1a98d8d7939472d5ea0abe17136358d0e604daec054daf4776e8

