Summary:
Adds an option to disable the storage of checkpoints when torchft is enabled.

With torchft handling replica recovery, model checkpoints are not strictly needed, so users don't have to rely on any external storage, which reduces the setup time to get things up and running. It also works as a killswitch: if checkpoint storage has issues, storage can be disabled completely so it doesn't impact training.
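
A minimal sketch of how such a killswitch could be wired into a checkpoint manager; the config field names (`disable_storage_with_ft`) and class names below are illustrative assumptions, not the exact option added by this PR:

```python
# Hypothetical sketch: a killswitch that short-circuits checkpoint saves
# when torchft replication can recover lost replicas. Names are illustrative.
from dataclasses import dataclass


@dataclass
class CheckpointConfig:
    enable_checkpoint: bool = True         # existing on/off switch for checkpointing
    disable_storage_with_ft: bool = False  # hypothetical killswitch for torchft runs


class CheckpointManager:
    def __init__(self, config: CheckpointConfig, ft_enabled: bool):
        self.config = config
        self.ft_enabled = ft_enabled

    def _storage_disabled(self) -> bool:
        # Skip external storage entirely when the killswitch is set and
        # torchft can restore lost replicas from healthy peers instead.
        return self.ft_enabled and self.config.disable_storage_with_ft

    def save(self, step: int) -> None:
        if not self.config.enable_checkpoint or self._storage_disabled():
            return  # no external I/O; torchft replication is the safety net
        print(f"writing checkpoint for step {step} to storage")


if __name__ == "__main__":
    mgr = CheckpointManager(
        CheckpointConfig(disable_storage_with_ft=True), ft_enabled=True
    )
    mgr.save(step=100)  # no-op: storage is disabled by the killswitch
```

The design keeps the existing `enable_checkpoint` behavior untouched and layers the torchft-specific switch on top, so flipping it off during an incident requires only a config change rather than code changes.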
---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/pytorch/torchtitan/pull/1810).
* #1856
* #1811
* __->__ #1810
Co-authored-by: Tushar Jain <[email protected]>