New way of dealing with duplicated jobs
We now allow duplicated jobs to be stored, but they are aborted at runtime. This is achieved by computing a checksum of the parameter hash, which also makes it possible to detect cases where different stacks lead to the same final hash. There is an additional attempt status, PARAM_CHECK, used while eHive checks whether the job has already been selected, and a new job status, REDUNDANT, for jobs that are discarded.
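The mechanism described above can be sketched as follows. This is a minimal illustration, not eHive's actual (Perl) implementation: the statuses PARAM_CHECK and REDUNDANT come from the description, while `param_checksum`, `JobStore`, and `claim` are hypothetical names. The key idea is that the checksum is computed over a canonicalised form of the parameters, so two jobs whose parameters were built differently but end up identical collide on the same checksum.

```python
import hashlib
import json

# Statuses named in the description; READY is assumed for illustration.
PARAM_CHECK = "PARAM_CHECK"
REDUNDANT = "REDUNDANT"
READY = "READY"

def param_checksum(params: dict) -> str:
    # Canonicalise the parameter hash (sorted keys) so that equal
    # parameter sets always produce the same checksum.
    canonical = json.dumps(params, sort_keys=True)
    return hashlib.md5(canonical.encode("utf-8")).hexdigest()

class JobStore:
    """Hypothetical store that aborts duplicated jobs at claim time."""

    def __init__(self):
        self._seen_checksums = set()

    def claim(self, job: dict) -> bool:
        # While checking, the attempt is flagged as PARAM_CHECK.
        job["attempt_status"] = PARAM_CHECK
        checksum = param_checksum(job["params"])
        if checksum in self._seen_checksums:
            # A job with the same final parameter hash already ran:
            # discard this one as REDUNDANT.
            job["status"] = REDUNDANT
            return False
        self._seen_checksums.add(checksum)
        job["status"] = READY
        return True

store = JobStore()
first = {"params": {"x": 1, "y": 2}}
second = {"params": {"y": 2, "x": 1}}  # built differently, same final hash
store.claim(first)   # accepted
store.claim(second)  # detected as a duplicate and marked REDUNDANT
```

Note that canonicalisation (here, key-sorted JSON) is what lets differently-constructed parameter sets be recognised as duplicates.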