Job Submitter
The job_submitter allows the execution of a parametric study from the command line, using a script mask and a dictionary of parameters to replace in this mask.
These parameters must be present in the given mask in the %(PARAMETER)s format (types other than string are also allowed).
The type of script and executable is freely choosable, but defaults to madx, for which this submitter was originally written.
When submitting to HTCondor, data to be transferred back to the working directory must be written to a sub-folder defined by job_output_dir, which defaults to Outputdata.
This script also allows checking whether all HTCondor jobs finished successfully, resubmitting with a different parameter grid, and running locally.
A Jobs.tfs file is created in the working directory, containing the job id, the parameters per job and the job directory for further post-processing.
For additional information and guides, see the Job Submitter page in the OMC
documentation site.
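As a minimal sketch of the substitution mechanism described above (the mask contents and parameter names are invented for illustration, and the grid is assumed to be the cartesian product of the value lists): each entry of replace_dict provides a list of values, and the mask is filled per job via Python %-formatting.

```python
from itertools import product

# Toy mask with %(PARAMETER)s placeholders (contents illustrative only):
mask = "beam = %(BEAM)s;\nmatch_tunes(%(QX)s, %(QY)s);"

# One list of values per parameter; one job per grid point
# (assumed here to be the cartesian product of the value lists):
replace_dict = {"BEAM": [1, 2], "QX": [62.31], "QY": [60.32]}

jobs = [dict(zip(replace_dict, values))
        for values in product(*replace_dict.values())]
scripts = [mask % job for job in jobs]  # %-formatting fills the placeholders

print(len(scripts))  # 2 jobs: BEAM=1 and BEAM=2
print(scripts[0])
```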
--Required--
mask (PathOrStr):
Program mask to use
replace_dict (DictAsString):
Dict containing the strings to replace as keys and, as values, the lists of parameters to replace them with.
working_directory (PathOrStr):
Directory where data should be put
--Optional--
append_jobs:
Flag to rerun the job with a finer/wider grid; already existing points will not be re-executed.
action:
store_true
check_files (str):
List of files/file-name-masks expected to be in the ‘job_output_dir’ after a successful job (for appending/resuming). Uses the ‘glob’ function, so unix-wildcards (*) are allowed. If not given, only the presence of the folder itself is checked.
dryrun:
Flag to only prepare folders and scripts, without starting/submitting the jobs. Together with resume_jobs, this can be used to check which jobs succeeded and which failed.
action:
store_true
executable (PathOrStr):
Path to executable or job-type (of [‘madx’, ‘python3’, ‘python2’]) to use.
default:
madx
htc_arguments (DictAsString):
Additional arguments for htcondor, as Dict-String. For AccountingGroup please use ‘accounting_group’. ‘max_retries’ and ‘notification’ have defaults (if not given). Others are just passed on.
default:
{}
job_output_dir (str):
The name of the output dir of the job. (Make sure your script puts its data there!)
default:
Outputdata
jobflavour (str):
Jobflavour giving a rough estimate of the runtime of one job.
choices:
('espresso', 'microcentury', 'longlunch', 'workday', 'tomorrow', 'testmatch', 'nextweek')
default:
workday
jobid_mask (str):
Mask to name jobs from replace_dict
num_processes (int):
Number of processes to be used if run locally
default:
4
output_destination (PathOrStr):
Directory to copy the output of the jobs to, sorted into folders per job. Can be on EOS, preferably via the EOS-URI format (‘root://eosuser.cern.ch//eos/…’).
resume_jobs:
Only do jobs that did not work.
action:
store_true
run_local:
Flag to run the jobs on the local machine. Not recommended.
action:
store_true
script_arguments (DictAsString):
Additional arguments to pass to the script, as dict in key-value pairs (‘--’ need to be included in the keys).
default:
{}
script_extension (str):
New extension for the scripts created from the masks. This is inferred automatically for [‘madx’, ‘python3’, ‘python2’]. Otherwise not changed.
ssh (str):
Run htcondor from this machine via ssh (needs access to the working_directory)
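To illustrate the glob-based check that check_files performs, the sketch below tests whether every file mask matches at least one file in a job's output directory (the directory layout, file names and helper function here are invented for illustration):

```python
import glob
import os
import tempfile

def job_succeeded(job_dir, output_dir_name, check_files):
    """Return True if every file mask matches at least one file in the
    job's output directory (mirrors the documented glob-based check)."""
    out = os.path.join(job_dir, output_dir_name)
    if not os.path.isdir(out):  # without check_files, only the folder itself is checked
        return False
    return all(glob.glob(os.path.join(out, mask)) for mask in check_files)

with tempfile.TemporaryDirectory() as job_dir:
    out = os.path.join(job_dir, "Outputdata")
    os.makedirs(out)
    open(os.path.join(out, "twiss.tfs"), "w").close()
    ok = job_succeeded(job_dir, "Outputdata", ["*.tfs"])            # matches
    failed = job_succeeded(job_dir, "Outputdata", ["*.tfs", "*.log"])  # *.log missing
```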
- pylhc_submitter.job_submitter.check_opts(opt)
Checks options and sorts them into job-creation and running parameters.
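Putting the documented options together, a submission could be configured as in the sketch below. The paths and parameter values are placeholders, and the commented-out `main` entry point is an assumption; consult the pylhc_submitter API for the exact call.

```python
# Hypothetical configuration using the options documented above
# (all paths and values are placeholders):
opt = dict(
    mask="my_study.mask",                      # script mask with %(PARAM)s placeholders
    working_directory="/path/to/study",        # where data should be put
    replace_dict={"QX": [62.28, 62.31], "QY": [60.31]},
    jobid_mask="job.qx%(QX)s.qy%(QY)s",        # names jobs from replace_dict
    executable="madx",                         # default job-type
    jobflavour="workday",                      # rough runtime estimate per job
)
# from pylhc_submitter.job_submitter import main  # entry point name is assumed
# main(**opt)
```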