slurmworkflow solves the problem of running multiple interdependent jobs on a Slurm-equipped HPC without requiring a long-lived job or a persistent SSH session.
A workflow is a predefined set of steps (`sbatch` submissions) to be executed on an HPC. By default the steps run sequentially, but slurmworkflow provides tools for altering the execution order, allowing conditional execution of steps and loop-like behavior.
Usage
See the test package and the package vignette for a detailed explanation.
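As a rough illustration of the idea, the sketch below builds a small two-step workflow. The function names and arguments shown (`create_workflow()`, `add_workflow_step()`, `step_tmpl_bash_lines()`, `step_tmpl_rscript()`, and the specific sbatch options) are assumptions recalled from the package documentation and may differ from the current API; the vignette remains the authoritative reference.

``` r
## Minimal sketch of defining a workflow (assumed API, see vignette)
library(slurmworkflow)

## Create a new workflow; this generates a directory of scripts
## to be copied to the HPC
wf <- create_workflow(
  wf_name = "example_workflow",
  default_sbatch_opts = list(
    "partition" = "general",   # assumed partition name
    "mail-type" = "FAIL"
  )
)

## Step 1: a simple bash step
wf <- add_workflow_step(
  wf_summary = wf,
  step_tmpl = step_tmpl_bash_lines(c(
    "echo 'step 1: preparing data'"
  )),
  sbatch_opts = list("cpus-per-task" = 1)
)

## Step 2: run an R script on the HPC (path is illustrative)
wf <- add_workflow_step(
  wf_summary = wf,
  step_tmpl = step_tmpl_rscript("R/02_analysis.R"),
  sbatch_opts = list(
    "cpus-per-task" = 4,
    "time" = "02:00:00"
  )
)
```

Once the workflow directory is copied to the HPC, it is started from a login node by executing the start script that slurmworkflow generates inside the workflow directory; each step then submits the next one, so no persistent session is needed.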