
A Distributed DataFrame library for large scale complex data processing.

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a tabular dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: the distributed Python dataframe for complex data

Daft is a fast, Pythonic and scalable open-source dataframe library built for Python and Machine Learning workloads.

Daft is currently in its Beta release phase - please expect bugs and rapid improvements to the project. We welcome user feedback and feature requests in our Discussions forums.


About Daft

The Daft dataframe is a table of data with rows and columns. Columns can contain any Python objects, which allows Daft to support rich complex data types such as images, audio, video and more.

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex multimodal data such as Images, Embeddings and Python objects. Ingestion and basic transformations of complex data are extremely easy and performant in Daft (see the sketch after this list).

  2. Notebook Computing: Daft is built for an interactive developer experience in notebooks - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Rich complex formats such as images can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.
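
A minimal sketch of points 1 and 3, assuming the daft.from_pydict and daft.context.set_runner_ray APIs and placeholder data:

import daft
import numpy as np

# Columns can mix primitives with arbitrary Python objects (here, numpy arrays)
df = daft.from_pydict({
    "id": [1, 2, 3],
    "embedding": [np.random.rand(4) for _ in range(3)],
})

df.show()

# To scale beyond a single machine, point Daft at a Ray cluster before building
# dataframes (the address below is a placeholder for your cluster's head node)
# daft.context.set_runner_ray(address="ray://<head-node>:10001")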

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from an AWS S3 bucket’s URLs and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails
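
As a follow-on sketch reusing the dataframe above (the "path" and "size" columns come from from_glob_path; the size threshold is a placeholder), transformations build a query plan and only run when results are requested:

# Keep only smaller files; execution happens on collect()/show()
small = df.where(df["size"] < 1_000_000).select("path", "resized")
small.collect()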

Benchmarks

Benchmark chart: SF100 TPC-H

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft's full range of capabilities, including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations, and more (a minimal UDF sketch follows this list).

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft
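
For a flavor of the UDFs and aggregations mentioned in the 10-minute tour, here is a minimal sketch, assuming the @daft.udf decorator and the groupby/sum API (the column names and data are placeholders):

import daft
from daft import col, DataType

@daft.udf(return_dtype=DataType.int64())
def double(x):
    # The UDF receives a column of values; return a list of the same length
    return [v * 2 for v in x.to_pylist()]

df = daft.from_pydict({"group": ["a", "a", "b"], "value": [1, 2, 3]})
df = df.with_column("doubled", double(col("value")))
df.groupby("group").sum("value").show()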

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
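
For example, to opt out for a single Python process (a minimal sketch; the variable should be set before Daft is imported):

import os

# Disable Daft analytics for this process, before importing Daft
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

import daft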

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.1.20.tar.gz (752.1 kB) - Source

Built Distributions

getdaft-0.1.20-cp37-abi3-win_amd64.whl (14.6 MB) - CPython 3.7+, Windows x86-64

getdaft-0.1.20-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (19.0 MB) - CPython 3.7+, manylinux: glibc 2.17+ x86-64

getdaft-0.1.20-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (17.4 MB) - CPython 3.7+, manylinux: glibc 2.17+ ARM64

getdaft-0.1.20-cp37-abi3-macosx_11_0_arm64.whl (13.8 MB) - CPython 3.7+, macOS 11.0+ ARM64

getdaft-0.1.20-cp37-abi3-macosx_10_7_x86_64.whl (15.2 MB) - CPython 3.7+, macOS 10.7+ x86-64

File details

Details for the file getdaft-0.1.20.tar.gz.

File metadata

  • Download URL: getdaft-0.1.20.tar.gz
  • Upload date:
  • Size: 752.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for getdaft-0.1.20.tar.gz
  • SHA256: bc99db830fdfb094a6bfd3bd6eb85819f3c15b0ee717ce2b739d8d151212fa9a
  • MD5: 837a36793ce0069b85673a9ddcfaecce
  • BLAKE2b-256: f6acd2d8ca19cfef43c27d79fa1264dc81ce46ce2a4be826edf550f6fee4b3e1

See more details on using hashes here.

File details

Details for the file getdaft-0.1.20-cp37-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.1.20-cp37-abi3-win_amd64.whl
  • Upload date:
  • Size: 14.6 MB
  • Tags: CPython 3.7+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for getdaft-0.1.20-cp37-abi3-win_amd64.whl
  • SHA256: db6d9f80c31934bbca3d41a81d8b8facdd57eb03951c9d1f145b311f36a504b4
  • MD5: bb76b8229b6a497457e951f0d4891b81
  • BLAKE2b-256: b09a04f4c764a1acf7b3138b97e6b78e181efddd0023eeab0b15cf0ad4c2603b

See more details on using hashes here.

File details

Details for the file getdaft-0.1.20-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.20-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  • SHA256: f3b5d9f47bdaf214d4aacaba039b10fc33fcd23411e468b6f0579c677e57cbd3
  • MD5: ce2eb48ee942b465f0a5c338ed8e0c31
  • BLAKE2b-256: 57a5ff79673e70aa53ac3606ae114acb91e3906d9f8f43ed4b2be3151228a767

See more details on using hashes here.

File details

Details for the file getdaft-0.1.20-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.20-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
  • SHA256: 31b9c9d559d7e578e8050ef5da16d8bb06a4a4f55fce7f26c5099137210f00ed
  • MD5: dc60a2c3c3c2a96ad5c6453c8354543d
  • BLAKE2b-256: 38c31705ded16149ba045962fd4e4604f12d4e64729107f07a13bd735b4e76cc

See more details on using hashes here.

File details

Details for the file getdaft-0.1.20-cp37-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.20-cp37-abi3-macosx_11_0_arm64.whl
  • SHA256: c53a6a7f37d455d0f19dd7b7251bff25274cff313415a7e32fc89cffd3c9a446
  • MD5: e67c6e731314483f27c059cf899c20db
  • BLAKE2b-256: 9e41b51cdd2078f0a3d930f882f9c7d281e2ac43c524eb23f6577c7c308d39e8

See more details on using hashes here.

File details

Details for the file getdaft-0.1.20-cp37-abi3-macosx_10_7_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.1.20-cp37-abi3-macosx_10_7_x86_64.whl
  • SHA256: 507c1652f7f1a5964bcc08a9ad3386f52389131fad143921b29a8a3870821353
  • MD5: f5e41bcde293ad4b434e317364dea746
  • BLAKE2b-256: 83c14814c6d20eccd7c3ab71fa6a0ed24edfd88e49ec4b0d64c76e64e85dd350

See more details on using hashes here.
