In order to derive the XP signature, Dora must know about the configuration schema your project is following. Dora supports two backends for that: argparse, and hydra. On top of that, Dora provides a smooth integration with PyTorch Lightning for projects that use it. For each XP, it is always possible to retrieve the last Slurm job that was associated with it.

In all cases, you must have a specific Python package (which we will call here myproj), with a train module in it (i.e. a myproj.train module, stored in the myproj/train.py file). The train.py file must contain a main function that is properly decorated, as explained hereafter.

Here is a template for the train.py file:

```python
import argparse

from dora import argparse_main, get_xp

parser = argparse.ArgumentParser("myproj.train")
# ... add your arguments to the parser here ...


@argparse_main(
    dir="./where_to_store_logs_and_checkpoints",
    parser=parser,
    exclude=[],  # argument names to leave out of the XP signature
    use_underscore=True,  # flags are --batch_size vs. --batch-size
)
def main():
    xp = get_xp()  # access the current XP (signature, folder, etc.)
    ...
```
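Since the XP signature is derived from the parsed, non-default arguments, it helps to see what such an argparse schema looks like on its own. The sketch below uses only the standard library; the parser name and the `--batch_size`/`--lr` arguments are hypothetical examples, not part of Dora itself (note the underscore-style flags matching `use_underscore=True`):

```python
import argparse

# Hypothetical parser for illustration only; Dora would derive the
# XP signature from the arguments a run overrides on such a parser.
parser = argparse.ArgumentParser("myproj.train")
parser.add_argument("--batch_size", type=int, default=32)
parser.add_argument("--lr", type=float, default=1e-3)

# Simulate a command line that overrides only the batch size;
# the learning rate keeps its default value.
args = parser.parse_args(["--batch_size", "64"])
print(args.batch_size)  # 64
print(args.lr)          # 0.001
```

Two runs that override the same set of arguments with the same values would thus map to the same signature, which is what lets Dora deduplicate and resume experiments.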