Torch Simple Timing
A simple yet versatile package to time CPU/GPU/Multi-GPU ops.
- "I want to time operations once"
- That's what a
Clock
is for
- That's what a
- "I want to time the same operations multiple times"
- That's what a
Timer
is for
- That's what a
In simple terms:

- A `Clock` is an object (and context manager) that computes the elapsed time between its `start()` (or `__enter__`) and its `stop()` (or `__exit__`). A minimal sketch of the underlying idea follows this list.
- A `Timer` will internally manage clocks so that you can focus on readability and not data structures.
Installation
```bash
pip install torch_simple_timing
```
How to use
A Clock
```python
from torch_simple_timing import Clock
import torch

t = torch.rand(2000, 2000)
gpu = torch.cuda.is_available()

# as a context manager
with Clock(gpu=gpu) as context_clock:
    torch.inverse(t @ t.T)

# or with explicit start()/stop()
clock = Clock(gpu=gpu).start()
torch.inverse(t @ t.T)
clock.stop()

print(context_clock.duration)  # 0.29688501358032227
print(clock.duration)  # 0.292896032333374
```
More examples, including how to easily share data structures using a store, can be found in the documentation.
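As a purely hypothetical illustration of that store idea, several clocks could append their durations into one shared dictionary. The `name=` and `store=` keyword arguments below are assumptions based on the sentence above, not a confirmed signature; check the documentation for the actual API:

```python
from collections import defaultdict

import torch
from torch_simple_timing import Clock

t = torch.rand(2000, 2000)
store = defaultdict(list)  # shared structure: clock name -> list of durations

# `name=` and `store=` are assumed keyword arguments here (see the docs)
for _ in range(3):
    with Clock(name="inverse", store=store, gpu=torch.cuda.is_available()):
        torch.inverse(t @ t.T)

print(store["inverse"])  # e.g. three durations, one per iteration
```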
A Timer
```python
from torch_simple_timing import Timer
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

X = torch.rand(5000, 5000, device=device)
Y = torch.rand(5000, 100, device=device)
model = torch.nn.Linear(5000, 100).to(device)
optimizer = torch.optim.Adam(model.parameters())

gpu = device.type == "cuda"
timer = Timer(gpu=gpu)

for epoch in range(10):
    timer.clock("epoch").start()
    for b in range(50):
        x = X[b * 100 : (b + 1) * 100]
        y = Y[b * 100 : (b + 1) * 100]
        optimizer.zero_grad()
        with timer.clock("forward", ignore=epoch > 0):
            p = model(x)
        loss = torch.nn.functional.cross_entropy(p, y)
        with timer.clock("backward", ignore=epoch > 0):
            loss.backward()
        optimizer.step()
    timer.clock("epoch").stop()

stats = timer.stats()
# use stats for display and/or logging
# wandb.summary.update(stats)
print(timer.display(stats=stats, precision=5))
```
```
epoch    : 0.25064 ± 0.02728 (n=10)
forward  : 0.00226 ± 0.00526 (n=50)
backward : 0.00209 ± 0.00387 (n=50)
```
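Each line reads as mean ± standard deviation over n measurements: the `epoch` clock ran once per epoch (n=10), while `forward` and `backward` were only recorded over the first epoch's 50 batches thanks to `ignore=epoch > 0`. Presumably, `stats` itself maps each clock name to these summary values (an assumption based on the display above; see the documentation for the exact structure).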
Download files

Download the file for your platform.

Source Distribution

torch_simple_timing-0.1.2.tar.gz (10.9 kB)

Built Distribution

torch_simple_timing-0.1.2-py3-none-any.whl (11.2 kB)
File details
Details for the file torch_simple_timing-0.1.2.tar.gz.

File metadata
- Download URL: torch_simple_timing-0.1.2.tar.gz
- Upload date:
- Size: 10.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.2.2 CPython/3.9.2 Darwin/21.2.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | d6c5526e44f77c70225cbbbe55e3db81312610b2cefdd8055a1a1e740810e330
MD5 | 12b84e2771a3ce02feb0cd5dddea5ba2
BLAKE2b-256 | 5034d0a33738ade61b316f42c742cf5c3d2ce943987714c000ac26f8f311b637
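To check a downloaded archive against the SHA256 digest above, you can hash it locally with Python's standard `hashlib`; the file path below is simply wherever you saved the archive:

```python
import hashlib

# hash the downloaded archive in chunks to keep memory use small
sha256 = hashlib.sha256()
with open("torch_simple_timing-0.1.2.tar.gz", "rb") as f:
    for chunk in iter(lambda: f.read(8192), b""):
        sha256.update(chunk)

# compare against the digest published in the table above
expected = "d6c5526e44f77c70225cbbbe55e3db81312610b2cefdd8055a1a1e740810e330"
print(sha256.hexdigest() == expected)
```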
File details
Details for the file torch_simple_timing-0.1.2-py3-none-any.whl.

File metadata
- Download URL: torch_simple_timing-0.1.2-py3-none-any.whl
- Upload date:
- Size: 11.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.2.2 CPython/3.9.2 Darwin/21.2.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | ae159dde33213112106933ddadefb57db54e480d4c26d6dbd96c7d3d8d12f821
MD5 | 7e43fdb9fba993b8fbddc1bb5d95f313
BLAKE2b-256 | 1b48a9346393e3bdc466c5d742dfdfafdd20460a875496434b4238f658be8da2