
📖 Reference

This page covers the launcher details and a complete runnable example with metric tracking. For installation, the API cheat sheet, and getting started, see the Quick Start.

🚀 Scheduler-Aware Launcher: ezpz launch

For complete CLI usage, flags, and the sequence diagram, see the ezpz launch CLI reference.

  • Scheduler smarts: ezpz launch detects PBS / Slurm automatically and, by default, selects the appropriate launcher for the detected job scheduler environment.

    • Sensible fallback: falls back to mpirun -np when running / testing locally.
  • Flexible resource specification: override resources explicitly with -np, -ppn, --nhosts, --hostfile, and other scheduler-specific options.

  • Pass-through arguments: forward any additional flags to the underlying launcher. Launcher-only flags/env (e.g., -x FOO=bar) go before --; everything after -- is the command to run:

    ezpz launch <launch flags> -- <command to run> <command args>
    
    Launcher Examples

    To pass arguments through to the launcher:1

    $ ezpz launch -- python3 -m ezpz.examples.fsdp
    
    # pass --line-buffer through to mpiexec:
    $ ezpz launch --line-buffer -- python3 \
          -m ezpz.examples.vit --compile --fsdp
    
    # Create and use a custom hostfile
    $ head -n 2 "${PBS_NODEFILE}" > hostfile0-2
    $ ezpz launch --hostfile hostfile0-2 -- python3 \
        -m ezpz.examples.fsdp_tp
    
    # use explicit np/ppn/nhosts
    $ ezpz launch \
          -np 4 \
          -ppn 2 \
          --nhosts 2 \
          --hostfile hostfile0-2 \
          -- \
          python3 -m ezpz.examples.diffusion
    
    # forward the PYTHONPATH environment variable
    $ ezpz launch -x PYTHONPATH=/tmp/.venv/bin:${PYTHONPATH} \
          -- \
          python3 -m ezpz.examples.fsdp
    

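The selection logic described above can be pictured with a small stand-alone sketch. This is not ezpz's actual implementation (the exact checks it performs may differ); it only illustrates the idea: PBS and Slurm export well-known environment variables, and if neither is present we fall back to a local mpirun.

```python
import os


def pick_launcher(env: dict) -> str:
    """Illustrative launcher selection (not ezpz's real code):
    inspect scheduler environment variables and pick a launcher."""
    if "PBS_NODEFILE" in env:  # PBS writes the job's hostfile path here
        return "mpiexec"
    if "SLURM_JOB_ID" in env:  # Slurm jobs carry a job id
        return "srun"
    return "mpirun"  # no scheduler detected: local fallback


if __name__ == "__main__":
    # On a laptop with no scheduler this prints "mpirun",
    # matching the fallback seen in the logs below.
    print(pick_launcher(dict(os.environ)))
```

This mirrors the footnote at the bottom of this page: srun under Slurm, mpirun / mpiexec otherwise.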
For the API cheat sheet (before/after diffs for setup, device management, model wrapping, training loop, and metric tracking), see the Quick Start.

✅ Complete Example with History

Capture metrics across all ranks, persist JSONL, generate text/PNG plots, and (when configured) log to Weights & Biases, all with no extra code on worker ranks. The History class aggregates distributed statistics (min/max/mean/std) and produces terminal-friendly plots automatically via finalize().

example.py
import time

import ezpz
import torch
from ezpz.models.minimal import SequentialLinearNet  # multi-layer Linear+ReLU network

logger = ezpz.get_logger(__name__)

rank = ezpz.setup_torch()
device = ezpz.get_torch_device()
model = SequentialLinearNet(
    input_dim=16,
    output_dim=32,
    sizes=[4, 8, 12]
)
model.to(device)
optimizer = torch.optim.AdamW(model.parameters())

history = ezpz.History()

for i in range(10):
    t0 = time.perf_counter()
    batch = torch.randn(1, 16)
    batch = batch.to(device)
    output = model(batch)
    pred = torch.randn(output.shape)
    loss = ((output - pred.to(device)) ** 2).sum()
    optimizer.zero_grad()  # clear gradients accumulated from the previous step
    loss.backward()
    optimizer.step()
    logger.info(
        history.update(
            {
                "iter": i,
                "loss": loss,
                "dt": time.perf_counter() - t0,
            }
        )
    )

if rank == 0:
    history.finalize()

ezpz.cleanup()

Swap in your own model

SequentialLinearNet is a small multi-layer Linear+ReLU network included for demonstration. Replace it with any torch.nn.Module β€” the rest of the script (setup, wrapping, training loop, history) stays the same.
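As a concrete sketch, here is one possible replacement (TinyMLP is a hypothetical name, not part of ezpz; shapes match the example's 16-dimensional input and 32-dimensional output):

```python
import torch


class TinyMLP(torch.nn.Module):
    """Illustrative drop-in replacement for SequentialLinearNet."""

    def __init__(self, input_dim: int = 16, hidden_dim: int = 64, output_dim: int = 32):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(input_dim, hidden_dim),
            torch.nn.GELU(),
            torch.nn.Linear(hidden_dim, output_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


# In example.py above, this line is the only change:
# model = TinyMLP()
```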

🪵 Logs
Single Process

Launching in a single process via python:


> python3 example.py
[2026-01-15 16:29:59,463919][I][ezpz/dist:1451:setup_torch_distributed] Using device=mps with backend=gloo
[2026-01-15 16:29:59,475974][I][ezpz/dist:1316:setup_torch_DDP] Caught MASTER_PORT=61496 from environment!
[2026-01-15 16:29:59,477538][I][ezpz/dist:1332:setup_torch_DDP] Using torch.distributed.init_process_group with
- master_addr='Sams-MacBook-Pro-2.local'
- master_port='61496'
- world_size=1
- rank=0
- local_rank=0
- timeout=datetime.timedelta(seconds=3600)
- backend='gloo'
[2026-01-15 16:29:59,478263][I][ezpz/dist:964:init_process_group] Calling torch.distributed.init_process_group_with: rank=0 world_size=1 backend=gloo
[2026-01-15 16:29:59,789459][I][ezpz/dist:1699:setup_torch] Using device='mps' with backend='gloo' + 'gloo' for distributed training.
[2026-01-15 16:29:59,872685][W][ezpz/dist:502:print_dist_setup] Using [1 / 1] available "mps" devices !!
[2026-01-15 16:29:59,873382][I][ezpz/dist:1746:setup_torch] ['Sams-MacBook-Pro-2.local'][device='mps'][node=0/0][rank=0/0][local_rank=0/0]
[2026-01-15 16:30:01,875023][I][ezpz/history:214:init] Not using distributed metrics! Will only be tracked from a single rank...
[2026-01-15 16:30:01,875595][I][ezpz/history:220:init] Using History with distributed_history=False
[2026-01-15 16:30:02,316946][I][ezpz/example:30:<module>] iter=0 loss=31.003010 dt=0.435792
[2026-01-15 16:30:02,330593][I][ezpz/example:30:<module>] iter=1 loss=57.543598 dt=0.008874
[2026-01-15 16:30:02,337684][I][ezpz/example:30:<module>] iter=2 loss=28.547897 dt=0.003079
[2026-01-15 16:30:02,346325][I][ezpz/example:30:<module>] iter=3 loss=22.243866 dt=0.002852
[2026-01-15 16:30:02,353276][I][ezpz/example:30:<module>] iter=4 loss=25.085716 dt=0.003102
[2026-01-15 16:30:02,359662][I][ezpz/example:30:<module>] iter=5 loss=27.327484 dt=0.002849
[2026-01-15 16:30:02,364890][I][ezpz/example:30:<module>] iter=6 loss=19.950121 dt=0.003308
[2026-01-15 16:30:02,371596][I][ezpz/example:30:<module>] iter=7 loss=36.892731 dt=0.005253
[2026-01-15 16:30:02,378344][I][ezpz/example:30:<module>] iter=8 loss=28.500504 dt=0.002372
[2026-01-15 16:30:02,384270][I][ezpz/example:30:<module>] iter=9 loss=33.020760 dt=0.002239
/Users/samforeman/vibes/saforem2/ezpz/src/ezpz/history.py:2223: UserWarning: Converting a tensor with requires_grad=True to a scalar may lead to unexpected behavior.
Consider using tensor.detach() first. (Triggered internally at /Users/runner/work/pytorch/pytorch/pytorch/torch/csrc/autograd/generated/python_variable_methods.cpp:837.)
x = torch.Tensor(x).numpy(force=True)
[2026-01-15 16:30:02,458225][I][ezpz/history:2385:finalize] Saving plots to /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/plots/mplot (matplotlib) and /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/plots/tplot (tplot)
[2026-01-15 16:30:03,822720][I][ezpz/tplot:321:tplot] Using plot type: line
[2026-01-15 16:30:03,823148][I][ezpz/tplot:323:tplot] Using plot marker: hd
[tplot: line plot of dt vs iter]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/plots/tplot/dt.txt
[2026-01-15 16:30:03,827907][I][ezpz/tplot:321:tplot] Using plot type: hist
[2026-01-15 16:30:03,828187][I][ezpz/tplot:323:tplot] Using plot marker: hd
[tplot: histogram of freq vs dt]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/plots/tplot/dt-hist.txt
[2026-01-15 16:30:03,833010][I][ezpz/tplot:321:tplot] Using plot type: line
[2026-01-15 16:30:03,833296][I][ezpz/tplot:323:tplot] Using plot marker: hd
[tplot: line plot of loss vs iter]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/plots/tplot/loss.txt
[2026-01-15 16:30:03,837141][W][ezpz/history:2420:finalize] h5py not found! Saving dataset as netCDF instead.
[2026-01-15 16:30:03,837503][I][utils/init:636:save_dataset] Saving dataset to: /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/dataset_dataset.nc
[2026-01-15 16:30:03,885343][I][ezpz/history:2433:finalize] Saving history report to /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-163002/2026-01-15-163002/report.md

ezpz launch

Launching via ezpz launch (no scheduler detected, so it falls back to mpirun with 2 processes on a MacBook Pro):


> ezpz launch python3 /tmp/test.py
[2026-01-15 16:25:45,611138][I][ezpz/launch:515:run] No active scheduler detected; falling back to local mpirun: mpirun -np 2 python3 /tmp/test.py
[2026-01-15 16:25:47,138854][I][ezpz/dist:1451:setup_torch_distributed] Using device=mps with backend=gloo
[2026-01-15 16:25:47,149140][I][ezpz/dist:1316:setup_torch_DDP] Caught MASTER_PORT=60839 from environment!
[2026-01-15 16:25:47,150476][I][ezpz/dist:1332:setup_torch_DDP] Using torch.distributed.init_process_group with
- master_addr='Sams-MacBook-Pro-2.local'
- master_port='60839'
- world_size=2
- rank=0
- local_rank=0
- timeout=datetime.timedelta(seconds=3600)
- backend='gloo'
[2026-01-15 16:25:47,151050][I][ezpz/dist:964:init_process_group] Calling torch.distributed.init_process_group_with: rank=0 world_size=2 backend=gloo
[2026-01-15 16:25:47,242104][I][ezpz/dist:1699:setup_torch] Using device='mps' with backend='gloo' + 'gloo' for distributed training.
[2026-01-15 16:25:47,261869][I][ezpz/dist:1746:setup_torch] ['Sams-MacBook-Pro-2.local'][device='mps'][node=0/0][rank=1/1][local_rank=1/1]
[2026-01-15 16:25:47,289930][W][ezpz/dist:502:print_dist_setup] Using [2 / 2] available "mps" devices !!
[2026-01-15 16:25:47,290348][I][ezpz/dist:1746:setup_torch] ['Sams-MacBook-Pro-2.local'][device='mps'][node=0/0][rank=0/1][local_rank=0/1]
[2026-01-15 16:25:48,882995][I][ezpz/history:220:init] Using History with distributed_history=True
[2026-01-15 16:25:49,293872][I][tmp/test:30:<module>] iter=0 loss=14.438349 dt=0.383613 loss/mean=18.930481 loss/max=23.422613 loss/min=14.438349 loss/std=4.492133 dt/mean=0.383651 dt/max=0.383690 dt/min=0.383613 dt/std=0.000000
[2026-01-15 16:25:49,310545][I][tmp/test:30:<module>] iter=1 loss=38.289841 dt=0.006327 loss/mean=37.768768 loss/max=38.289841 loss/min=37.247700 loss/std=0.521159 dt/mean=0.006445 dt/max=0.006563 dt/min=0.006327 dt/std=0.000118
[2026-01-15 16:25:49,323389][I][tmp/test:30:<module>] iter=2 loss=15.649942 dt=0.003752 loss/mean=26.894470 loss/max=38.138996 loss/min=15.649942 loss/std=11.244525 dt/mean=0.003934 dt/max=0.004116 dt/min=0.003752 dt/std=0.000182
[2026-01-15 16:25:49,335400][I][tmp/test:30:<module>] iter=3 loss=21.518583 dt=0.006340 loss/mean=38.892834 loss/max=56.267082 loss/min=21.518583 loss/std=17.374252 dt/mean=0.006604 dt/max=0.006869 dt/min=0.006340 dt/std=0.000264
[2026-01-15 16:25:49,343467][I][tmp/test:30:<module>] iter=4 loss=43.398060 dt=0.003205 loss/mean=41.371902 loss/max=43.398060 loss/min=39.345749 loss/std=2.026196 dt/mean=0.002617 dt/max=0.003205 dt/min=0.002029 dt/std=0.000588
[2026-01-15 16:25:49,351912][I][tmp/test:30:<module>] iter=5 loss=43.348061 dt=0.002345 loss/mean=39.714069 loss/max=43.348061 loss/min=36.080078 loss/std=3.633997 dt/mean=0.002180 dt/max=0.002345 dt/min=0.002014 dt/std=0.000166
[2026-01-15 16:25:49,360378][I][tmp/test:30:<module>] iter=6 loss=40.937546 dt=0.003073 loss/mean=36.756641 loss/max=40.937546 loss/min=32.575737 loss/std=4.180907 dt/mean=0.002433 dt/max=0.003073 dt/min=0.001794 dt/std=0.000640
[2026-01-15 16:25:49,368605][I][tmp/test:30:<module>] iter=7 loss=30.643730 dt=0.002785 loss/mean=32.207088 loss/max=33.770447 loss/min=30.643730 loss/std=1.563398 dt/mean=0.002315 dt/max=0.002785 dt/min=0.001844 dt/std=0.000470
[2026-01-15 16:25:49,377235][I][tmp/test:30:<module>] iter=8 loss=26.110786 dt=0.003046 loss/mean=33.217815 loss/max=40.324844 loss/min=26.110786 loss/std=7.107031 dt/mean=0.002361 dt/max=0.003046 dt/min=0.001676 dt/std=0.000685
[2026-01-15 16:25:49,384409][I][tmp/test:30:<module>] iter=9 loss=22.861826 dt=0.001886 loss/mean=25.471987 loss/max=28.082148 loss/min=22.861826 loss/std=2.610158 dt/mean=0.002179 dt/max=0.002472 dt/min=0.001886 dt/std=0.000293
/Users/samforeman/vibes/saforem2/ezpz/src/ezpz/history.py:2223: UserWarning: Converting a tensor with requires_grad=True to a scalar may lead to unexpected behavior.
Consider using tensor.detach() first. (Triggered internally at /Users/runner/work/pytorch/pytorch/pytorch/torch/csrc/autograd/generated/python_variable_methods.cpp:837.)
x = torch.Tensor(x).numpy(force=True)
[2026-01-15 16:25:49,455888][I][ezpz/history:2385:finalize] Saving plots to /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/mplot (matplotlib) and /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot (tplot)
                    dt                                    dt/min
     β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”     β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
0.384β”€β–Œ                                β”‚0.384─-                                β”‚
0.320─▐                                β”‚0.129─ --------------------------------β”‚
0.256─ β–š                               β”‚     β””β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”˜
0.129─ ▝▖                              β”‚     1.0     3.2     5.5     7.8   10.0 
0.066─  ▐                              β”‚dt/min              iter
0.002─   β–šβ–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β–„β”‚                    dt/std
     β””β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”˜       β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
     1.0     3.2     5.5     7.8   10.0 0.00068─             *      *      *   β”‚
dt                  iter                0.00046─       ****** **   * ****** ***β”‚
                dt/mean                 0.00011─*******         ***            β”‚
     β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”       β””β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”¬β”˜
0.384─·                                β”‚       1.0     3.2    5.5     7.8  10.0 
0.320─·                                β”‚dt/std               iter
0.256─ Β·                               β”‚                   dt/max
0.193─  Β·                              β”‚     β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
0.129─  Β·                              β”‚0.384─+                                β”‚
0.066─   Β·                             β”‚0.257─ ++                              β”‚
0.002─    Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·Β·β”‚0.066─   ++++++++++++++++++++++++++++++β”‚
     β””β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”˜     β””β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”¬β”˜
    1.0     3.2     5.5     7.8   10.0      1.0     3.2     5.5     7.8   10.0 
dt/mean             iter                dt/max              iter              
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/dt.txt
[tplot: combined summary of dt, dt/max, dt/min, dt/mean vs iter]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/dt_summary.txt
[tplot: histograms of dt/mean, dt/max, dt/min, dt/std]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/dt_hist.txt
[tplot: line plots of loss, loss/min, loss/mean, loss/std, loss/max vs iter]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/loss.txt
[tplot: combined summary of loss, loss/max, loss/min, loss/mean vs iter]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/loss_summary.txt
[tplot: histograms of loss/mean, loss/max, loss/min, loss/std]
text saved in /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/plots/tplot/loss_hist.txt
[2026-01-15 16:25:50,768264][W][ezpz/history:2420:finalize] h5py not found! Saving dataset as netCDF instead.
[2026-01-15 16:25:50,768640][I][utils/init:636:save_dataset] Saving dataset to: /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/dataset_dataset.nc
[2026-01-15 16:25:50,817704][I][ezpz/history:2433:finalize] Saving history report to /Users/samforeman/vibes/saforem2/ezpz/outputs/History-2026-01-15-162549/2026-01-15-162549/report.md
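The loss/mean, loss/max, loss/min, and loss/std columns in the two-process log above are per-iteration reductions of each metric across ranks. A rough stand-alone sketch of that bookkeeping (History's real implementation reduces tensors with torch.distributed; this just shows the arithmetic):

```python
from statistics import mean, pstdev


def aggregate(values):
    """Reduce one metric's per-rank values into the summary statistics
    reported as <name>/mean, <name>/max, <name>/min, <name>/std."""
    return {
        "mean": mean(values),
        "max": max(values),
        "min": min(values),
        "std": pstdev(values),  # population std across ranks
    }


# e.g. the two ranks' losses at iter=0 in the log above:
stats = aggregate([14.438349, 23.422613])
```

With the iter=0 values from the log, this reproduces loss/mean=18.930481 and loss/std≈4.492133, suggesting the reported std is the population standard deviation over ranks.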


  1. This will be srun if a Slurm scheduler is detected, mpirun / mpiexec otherwise.