# Reference
This page covers the launcher details and a complete runnable example with metric tracking. For installation, the API cheat sheet, and getting started, see the Quick Start.
## Scheduler-Aware Launcher: `ezpz launch`

For complete CLI usage, flags, and the sequence diagram, see the `ezpz launch` CLI reference.
- **Scheduler smarts**: detects PBS / Slurm automatically. `ezpz launch` will, by default, determine the appropriate launcher based on the detected job scheduler environment.
- **Sensible fallback**: falls back to `mpirun -np` when running / testing locally.
- **Flexible resource specification**: `-np`, `-ppn`, `--nhosts`, `--hostfile`, and other scheduler-specific options.
- **Pass-through arguments**: pass any additional flags through to the underlying launcher. For launcher-only flags / environment variables (e.g., `-x FOO=bar`), place them before `--`; everything after `--` is the command to run.

Launcher examples, passing arguments through to the underlying launcher[^1]:

```bash
$ ezpz launch -- python3 -m ezpz.examples.fsdp

# pass --line-buffer through to mpiexec:
$ ezpz launch --line-buffer -- python3 \
    -m ezpz.examples.vit --compile --fsdp

# create and use a custom hostfile
$ head -n 2 "${PBS_NODEFILE}" > hostfile0-2
$ ezpz launch --hostfile hostfile0-2 -- python3 \
    -m ezpz.examples.fsdp_tp

# use explicit np/ppn/nhosts
$ ezpz launch \
    -np 4 \
    -ppn 2 \
    --nhosts 2 \
    --hostfile hostfile0-2 \
    -- \
    python3 -m ezpz.examples.diffusion

# forward the PYTHONPATH environment variable
$ ezpz launch -x PYTHONPATH=/tmp/.venv/bin:${PYTHONPATH} \
    -- \
    python3 -m ezpz.examples.fsdp
```
For the API cheat sheet (before/after diffs for setup, device management, model wrapping, training loop, and metric tracking), see the Quick Start.
## Complete Example with History
Capture metrics across all ranks, persist JSONL, generate text/PNG plots, and
(when configured) log to Weights & Biases, with no extra code on worker ranks.
The History class aggregates distributed statistics (min/max/mean/std) and
produces terminal-friendly plots automatically via finalize().
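The cross-rank statistics that `history.update` reports (the `loss/mean`, `loss/max`, `loss/min`, and `loss/std` keys visible in the logs below) amount to summary statistics over each metric's per-rank values. Conceptually, the reduction looks like the plain-Python sketch below; this is an illustration of the reported keys, not ezpz's actual implementation, which reduces across ranks via `torch.distributed`:

```python
import statistics


def aggregate(name: str, per_rank_values: list[float]) -> dict[str, float]:
    """Summarize one metric's per-rank values into the keys History reports."""
    return {
        name: per_rank_values[0],  # this rank's own value
        f"{name}/mean": statistics.fmean(per_rank_values),
        f"{name}/max": max(per_rank_values),
        f"{name}/min": min(per_rank_values),
        f"{name}/std": statistics.pstdev(per_rank_values),
    }


# e.g., the iter=0 losses from the 2-process log below:
stats = aggregate("loss", [14.438349, 23.422613])
print(f"{stats['loss/mean']:.6f} {stats['loss/std']:.6f}")
```

Plugging in the two per-rank losses from the first `ezpz launch` log line below reproduces its `loss/mean=18.930481` and `loss/std=4.492133`.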
```python
import time

import torch

import ezpz
from ezpz.models.minimal import SequentialLinearNet  # multi-layer Linear+ReLU network

logger = ezpz.get_logger(__name__)
rank = ezpz.setup_torch()
device = ezpz.get_torch_device()

model = SequentialLinearNet(
    input_dim=16,
    output_dim=32,
    sizes=[4, 8, 12],
)
model.to(device)
optimizer = torch.optim.AdamW(model.parameters())
history = ezpz.History()

for i in range(10):
    t0 = time.perf_counter()
    batch = torch.randn(1, 16).to(device)
    output = model(batch)
    pred = torch.randn(output.shape).to(device)
    loss = ((output - pred) ** 2).sum()
    optimizer.zero_grad()  # clear gradients accumulated on the previous step
    loss.backward()
    optimizer.step()
    logger.info(
        history.update(  # records metrics and returns a printable summary
            {
                "iter": i,
                "loss": loss,
                "dt": time.perf_counter() - t0,
            }
        )
    )

if rank == 0:
    history.finalize()
ezpz.cleanup()
```
**Swap in your own model**

`SequentialLinearNet` is a small multi-layer Linear+ReLU network included
for demonstration. Replace it with any `torch.nn.Module`; the rest of
the script (setup, wrapping, training loop, history) stays the same.
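For instance, any module with the same 16-in / 32-out shape drops straight into the loop above. A minimal sketch — the `TinyMLP` class here is hypothetical, written for this page, and is not part of ezpz:

```python
import torch
import torch.nn as nn


class TinyMLP(nn.Module):
    """Hypothetical drop-in replacement for SequentialLinearNet (16 -> 32)."""

    def __init__(self, input_dim: int = 16, hidden_dim: int = 64, output_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.GELU(),
            nn.Linear(hidden_dim, output_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


model = TinyMLP()
out = model(torch.randn(1, 16))
print(tuple(out.shape))
```

Because the batch and loss construction in the example only depend on the input/output dimensions, nothing else in the script changes.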
### Logs

#### Single process

Launching in a single process via `python3`:

> python3 example.py
[2026-01-15 16:29:59,463919][I][ezpz/dist:1451:setup_torch_distributed] Using device=mps with backend=gloo
[2026-01-15 16:29:59,475974][I][ezpz/dist:1316:setup_torch_DDP] Caught MASTER_PORT=61496 from environment!
[2026-01-15 16:29:59,477538][I][ezpz/dist:1332:setup_torch_DDP] Using torch.distributed.init_process_group with
- master_addr='Sams-MacBook-Pro-2.local'
- master_port='61496'
- world_size=1
- rank=0
- local_rank=0
- timeout=datetime.timedelta(seconds=3600)
- backend='gloo'
[2026-01-15 16:29:59,478263][I][ezpz/dist:964:init_process_group] Calling torch.distributed.init_process_group_with: rank=0 world_size=1 backend=gloo
[2026-01-15 16:29:59,789459][I][ezpz/dist:1699:setup_torch] Using device='mps' with backend='gloo' + 'gloo' for distributed training.
[2026-01-15 16:29:59,872685][W][ezpz/dist:502:print_dist_setup] Using [1 / 1] available "mps" devices !!
[2026-01-15 16:29:59,873382][I][ezpz/dist:1746:setup_torch] ['Sams-MacBook-Pro-2.local'][device='mps'][node=0/0][rank=0/0][local_rank=0/0]
[2026-01-15 16:30:01,875023][I][ezpz/history:214:init] Not using distributed metrics! Will only be tracked from a single rank...
[2026-01-15 16:30:01,875595][I][ezpz/history:220:init] Using History with distributed_history=False
[2026-01-15 16:30:02,316946][I][ezpz/example:30:<module>] iter=0 loss=31.003010 dt=0.435792
[2026-01-15 16:30:02,330593][I][ezpz/example:30:<module>] iter=1 loss=57.543598 dt=0.008874
[2026-01-15 16:30:02,337684][I][ezpz/example:30:<module>] iter=2 loss=28.547897 dt=0.003079
[2026-01-15 16:30:02,346325][I][ezpz/example:30:<module>] iter=3 loss=22.243866 dt=0.002852
[2026-01-15 16:30:02,353276][I][ezpz/example:30:<module>] iter=4 loss=25.085716 dt=0.003102
[2026-01-15 16:30:02,359662][I][ezpz/example:30:<module>] iter=5 loss=27.327484 dt=0.002849
[2026-01-15 16:30:02,364890][I][ezpz/example:30:<module>] iter=6 loss=19.950121 dt=0.003308
[2026-01-15 16:30:02,371596][I][ezpz/example:30:<module>] iter=7 loss=36.892731 dt=0.005253
[2026-01-15 16:30:02,378344][I][ezpz/example:30:<module>] iter=8 loss=28.500504 dt=0.002372
[2026-01-15 16:30:02,384270][I][ezpz/example:30:<module>] iter=9 loss=33.020760 dt=0.002239
/Users/samforeman/vibes/saforem2/ezpz/src/ezpz/history.py:2223: UserWarning: Converting a tensor with requires_grad=True to a scalar may lead to unexpected behavior.
Consider using tensor.detach() first. (Triggered internally at /Users/runner/work/pytorch/pytorch/pytorch/torch/csrc/autograd/generated/python_variable_methods.cpp:837.)
x = torch.Tensor(x).numpy(force=True)
[2026-01-15 16:30:02,458225][I][ezpz/history:2385:finalize] Saving plots to /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/plots/mplot (matplotlib) and /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/plots/tplot (tplot)
[2026-01-15 16:30:03,822720][I][ezpz/tplot:321:tplot] Using plot type: line
[2026-01-15 16:30:03,823148][I][ezpz/tplot:323:tplot] Using plot marker: hd
[tplot line plot: dt vs iter]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/plots/tplot/dt.txt
[2026-01-15 16:30:03,827907][I][ezpz/tplot:321:tplot] Using plot type: hist
[2026-01-15 16:30:03,828187][I][ezpz/tplot:323:tplot] Using plot marker: hd
[tplot histogram: freq vs dt]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/plots/tplot/dt-hist.txt
[2026-01-15 16:30:03,833010][I][ezpz/tplot:321:tplot] Using plot type: line
[2026-01-15 16:30:03,833296][I][ezpz/tplot:323:tplot] Using plot marker: hd
[tplot line plot: loss vs iter]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/plots/tplot/loss.txt
[2026-01-15 16:30:03,837141][W][ezpz/history:2420:finalize] h5py not found! Saving dataset as netCDF instead.
[2026-01-15 16:30:03,837503][I][utils/init:636:save_dataset] Saving dataset to: /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/dataset_dataset.nc
[2026-01-15 16:30:03,885343][I][ezpz/history:2433:finalize] Saving history report to /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/report.md
#### `ezpz launch`

Launching via `ezpz launch` (local fallback with 2 processes on a MacBook Pro):

> ezpz launch python3 /tmp/test.py
[2026-01-15 16:25:45,611138][I][ezpz/launch:515:run] No active scheduler detected; falling back to local mpirun: mpirun -np 2 python3 /tmp/test.py
[2026-01-15 16:25:47,138854][I][ezpz/dist:1451:setup_torch_distributed] Using device=mps with backend=gloo
[2026-01-15 16:25:47,149140][I][ezpz/dist:1316:setup_torch_DDP] Caught MASTER_PORT=60839 from environment!
[2026-01-15 16:25:47,150476][I][ezpz/dist:1332:setup_torch_DDP] Using torch.distributed.init_process_group with
- master_addr='Sams-MacBook-Pro-2.local'
- master_port='60839'
- world_size=2
- rank=0
- local_rank=0
- timeout=datetime.timedelta(seconds=3600)
- backend='gloo'
[2026-01-15 16:25:47,151050][I][ezpz/dist:964:init_process_group] Calling torch.distributed.init_process_group_with: rank=0 world_size=2 backend=gloo
[2026-01-15 16:25:47,242104][I][ezpz/dist:1699:setup_torch] Using device='mps' with backend='gloo' + 'gloo' for distributed training.
[2026-01-15 16:25:47,261869][I][ezpz/dist:1746:setup_torch] ['Sams-MacBook-Pro-2.local'][device='mps'][node=0/0][rank=1/1][local_rank=1/1]
[2026-01-15 16:25:47,289930][W][ezpz/dist:502:print_dist_setup] Using [2 / 2] available "mps" devices !!
[2026-01-15 16:25:47,290348][I][ezpz/dist:1746:setup_torch] ['Sams-MacBook-Pro-2.local'][device='mps'][node=0/0][rank=0/1][local_rank=0/1]
[2026-01-15 16:25:48,882995][I][ezpz/history:220:init] Using History with distributed_history=True
[2026-01-15 16:25:49,293872][I][tmp/test:30:<module>] iter=0 loss=14.438349 dt=0.383613 loss/mean=18.930481 loss/max=23.422613 loss/min=14.438349 loss/std=4.492133 dt/mean=0.383651 dt/max=0.383690 dt/min=0.383613 dt/std=0.000000
[2026-01-15 16:25:49,310545][I][tmp/test:30:<module>] iter=1 loss=38.289841 dt=0.006327 loss/mean=37.768768 loss/max=38.289841 loss/min=37.247700 loss/std=0.521159 dt/mean=0.006445 dt/max=0.006563 dt/min=0.006327 dt/std=0.000118
[2026-01-15 16:25:49,323389][I][tmp/test:30:<module>] iter=2 loss=15.649942 dt=0.003752 loss/mean=26.894470 loss/max=38.138996 loss/min=15.649942 loss/std=11.244525 dt/mean=0.003934 dt/max=0.004116 dt/min=0.003752 dt/std=0.000182
[2026-01-15 16:25:49,335400][I][tmp/test:30:<module>] iter=3 loss=21.518583 dt=0.006340 loss/mean=38.892834 loss/max=56.267082 loss/min=21.518583 loss/std=17.374252 dt/mean=0.006604 dt/max=0.006869 dt/min=0.006340 dt/std=0.000264
[2026-01-15 16:25:49,343467][I][tmp/test:30:<module>] iter=4 loss=43.398060 dt=0.003205 loss/mean=41.371902 loss/max=43.398060 loss/min=39.345749 loss/std=2.026196 dt/mean=0.002617 dt/max=0.003205 dt/min=0.002029 dt/std=0.000588
[2026-01-15 16:25:49,351912][I][tmp/test:30:<module>] iter=5 loss=43.348061 dt=0.002345 loss/mean=39.714069 loss/max=43.348061 loss/min=36.080078 loss/std=3.633997 dt/mean=0.002180 dt/max=0.002345 dt/min=0.002014 dt/std=0.000166
[2026-01-15 16:25:49,360378][I][tmp/test:30:<module>] iter=6 loss=40.937546 dt=0.003073 loss/mean=36.756641 loss/max=40.937546 loss/min=32.575737 loss/std=4.180907 dt/mean=0.002433 dt/max=0.003073 dt/min=0.001794 dt/std=0.000640
[2026-01-15 16:25:49,368605][I][tmp/test:30:<module>] iter=7 loss=30.643730 dt=0.002785 loss/mean=32.207088 loss/max=33.770447 loss/min=30.643730 loss/std=1.563398 dt/mean=0.002315 dt/max=0.002785 dt/min=0.001844 dt/std=0.000470
[2026-01-15 16:25:49,377235][I][tmp/test:30:<module>] iter=8 loss=26.110786 dt=0.003046 loss/mean=33.217815 loss/max=40.324844 loss/min=26.110786 loss/std=7.107031 dt/mean=0.002361 dt/max=0.003046 dt/min=0.001676 dt/std=0.000685
[2026-01-15 16:25:49,384409][I][tmp/test:30:<module>] iter=9 loss=22.861826 dt=0.001886 loss/mean=25.471987 loss/max=28.082148 loss/min=22.861826 loss/std=2.610158 dt/mean=0.002179 dt/max=0.002472 dt/min=0.001886 dt/std=0.000293
/Users/samforeman/vibes/saforem2/ezpz/src/ezpz/history.py:2223: UserWarning: Converting a tensor with requires_grad=True to a scalar may lead to unexpected behavior.
Consider using tensor.detach() first. (Triggered internally at /Users/runner/work/pytorch/pytorch/pytorch/torch/csrc/autograd/generated/python_variable_methods.cpp:837.)
x = torch.Tensor(x).numpy(force=True)
[2026-01-15 16:25:49,455888][I][ezpz/history:2385:finalize] Saving plots to /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/mplot (matplotlib) and /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot (tplot)
[tplot line plots: dt, dt/min, dt/mean, dt/std, dt/max vs iter]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/dt.txt
[tplot summary plot: dt with dt/max, dt/min, dt/mean overlays]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/dt_summary.txt
[tplot histograms: dt/mean, dt/max, dt/min, dt/std]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/dt_hist.txt
[tplot line plots: loss, loss/min, loss/mean, loss/std, loss/max vs iter]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/loss.txt
[tplot summary plot: loss with loss/max, loss/min, loss/mean overlays]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/loss_summary.txt
[tplot histograms: loss/mean, loss/max, loss/min, loss/std]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/loss_hist.txt
[2026-01-15 16:25:50,768264][W][ezpz/history:2420:finalize] h5py not found! Saving dataset as netCDF instead.
[2026-01-15 16:25:50,768640][I][utils/init:636:save_dataset] Saving dataset to: /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/dataset_dataset.nc
[2026-01-15 16:25:50,817704][I][ezpz/history:2433:finalize] Saving history report to /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/report.md
[^1]: This will be `srun` if a Slurm scheduler is detected, and `mpirun`/`mpiexec` otherwise.