
A simple package to time CPU/GPU/Multi-GPU ops

Project description

Torch Simple Timing

A simple yet versatile package to time CPU/GPU/Multi-GPU ops.

  1. "I want to time operations once"
    1. That's what a Clock is for
  2. "I want to time the same operations multiple times"
    1. That's what a Timer is for

In simple terms:

  • A Clock is an object (and context manager) that computes the elapsed time between its start() (or __enter__) and stop() (or __exit__) calls
  • A Timer will internally manage clocks so that you can focus on readability and not data structures

Installation

pip install torch_simple_timing

How to use

A Clock

from torch_simple_timing import Clock
import torch

t = torch.rand(2000, 2000)
gpu = torch.cuda.is_available()

# as a context manager:
with Clock(gpu=gpu) as context_clock:
    torch.inverse(t @ t.T)

# or manually, with start() and stop():
clock = Clock(gpu=gpu).start()
torch.inverse(t @ t.T)
clock.stop()

print(context_clock.duration) # 0.29688501358032227
print(clock.duration)         # 0.292896032333374
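
Why the gpu flag matters: CUDA kernels launch asynchronously, so a plain wall clock mostly measures the launch, not the computation, unless the device is synchronized first. The snippet below is a minimal sketch of that standard pattern using manual torch.cuda.synchronize() calls (assuming a CUDA device is available); it illustrates the idea the gpu flag abstracts away, not the package's actual internals.

import time
import torch

t = torch.rand(2000, 2000, device="cuda")

# Wait for pending kernels, then measure between two synchronization
# points; without synchronize() we would mostly time the async launch.
torch.cuda.synchronize()
start = time.perf_counter()
torch.inverse(t @ t.T)
torch.cuda.synchronize()
print(time.perf_counter() - start)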

More examples, including how to easily share data structures using a store, can be found in the documentation.
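
As a rough mental model (not the package's actual API, which is covered in the documentation), a store can be thought of as a shared mapping from operation names to lists of measured durations:

from collections import defaultdict

# Hypothetical illustration: several clocks appending their durations
# into one shared mapping so results can be aggregated in one place.
store = defaultdict(list)
store["inverse"].append(0.293)
store["inverse"].append(0.291)
print(store)  # defaultdict(<class 'list'>, {'inverse': [0.293, 0.291]})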

A Timer

from torch_simple_timing import Timer
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

X = torch.rand(5000, 5000, device=device)
Y = torch.rand(5000, 100, device=device)
model = torch.nn.Linear(5000, 100).to(device)
optimizer = torch.optim.Adam(model.parameters())

gpu = device.type == "cuda"
timer = Timer(gpu=gpu)

for epoch in range(10):
    timer.mark("epoch").start()  # time the full epoch
    for b in range(50):
        x = X[b*100: (b+1)*100]
        y = Y[b*100: (b+1)*100]
        optimizer.zero_grad()
        # ignore=True skips timing, so forward/backward are only
        # measured during the first epoch
        with timer.mark("forward", ignore=epoch > 0):
            p = model(x)
        loss = torch.nn.functional.cross_entropy(p, y)
        with timer.mark("backward", ignore=epoch > 0):
            loss.backward()
        optimizer.step()
    timer.mark("epoch").stop()

stats = timer.stats()
# use stats for display and/or logging
# wandb.summary.update(stats)
print(timer.display(stats=stats, precision=5))
epoch    : 0.25064 ± 0.02728 (n=10)
forward  : 0.00226 ± 0.00526 (n=50)
backward : 0.00209 ± 0.00387 (n=50)
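
Each label is reported as mean ± standard deviation over its n recorded durations. As an illustration of how such a summary can be computed from raw timings (the exact shape of the stats dictionary is documented in the package; the times mapping below is made up):

import statistics

# Hypothetical raw data: a mapping from labels to lists of durations.
times = {"epoch": [0.25, 0.24, 0.27], "forward": [0.0021, 0.0024]}

for name, values in times.items():
    mean = statistics.mean(values)
    std = statistics.stdev(values) if len(values) > 1 else 0.0
    print(f"{name:8}: {mean:.5f} ± {std:.5f} (n={len(values)})")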

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torch_simple_timing-0.1.0.tar.gz (9.8 kB)

Uploaded Source

Built Distribution

torch_simple_timing-0.1.0-py3-none-any.whl (10.8 kB)

Uploaded Python 3

File details

Details for the file torch_simple_timing-0.1.0.tar.gz.

File metadata

  • Download URL: torch_simple_timing-0.1.0.tar.gz
  • Size: 9.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.2 CPython/3.9.2 Darwin/21.2.0

File hashes

Hashes for torch_simple_timing-0.1.0.tar.gz
Algorithm Hash digest
SHA256 15d35a52d190815444adaad97bdca44b479472574f9ef76b837ec7e8fa043ca1
MD5 44fc3a35549e1c237d59e61e045d11e7
BLAKE2b-256 4658f9962fd8cd9ff23d0b17c2efbb370fbc23b3e8cd832c179b7b38f52f39fe

See more details on using hashes here.

File details

Details for the file torch_simple_timing-0.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for torch_simple_timing-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 de3d18b97ab4f740fd1f5c0b600dd31a0f9d3f87de7dcf26717bc7925ce9ba0b
MD5 3a2177dc0ef448b9b0d6d6a0be78c9a9
BLAKE2b-256 a9b9911e42596c96b096d18e4dc7b6b6096aa88627be04bca14dd7fe924cda06

See more details on using hashes here.
