[Tip] How to add individual delays to a cron job?

In a big Unix environment you likely install cron jobs like this on a thousand systems:

39 15 * * * { /usr/local/monitoring/sendstats ; } >/dev/null 2>&1

If all the system clocks are synchronized (usually via NTP), these jobs run at *exactly* the same time.
If the cron job accesses a shared resource (a network file system, a database, a mail server, ...), the result is a load peak!
The solution is to add individual delays.
Say the delay should be within 60 seconds.
bash and ksh have $RANDOM and arithmetic expansion; the following looks promising:

39 15 * * * sleep $((RANDOM % 60)) && { /usr/local/monitoring/sendstats ; } >/dev/null 2>&1

Unfortunately, % has a special meaning to cron: an unescaped % ends the command and is turned into a newline, and the rest is passed to the command as standard input. It must therefore be escaped as \%, and the shell then sees \%, which breaks the arithmetic.
This is a real obstacle!
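(As an aside: one way around the % problem, sketched here, is to replace the modulo with integer division, so no % appears in the crontab at all. $RANDOM ranges from 0 to 32767, so dividing by 550 yields 0-59. This assumes the same bash/ksh crontab SHELL that the $RANDOM line above relies on.)

39 15 * * * sleep $((RANDOM / 550)) && { /usr/local/monitoring/sendstats ; } >/dev/null 2>&1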

I have found the following solutions:

39 15 * * * perl -e 'sleep rand(60)' && { /usr/local/monitoring/sendstats ; } >/dev/null 2>&1
39 15 * * * sleep `cksum /etc/hosts | awk '{print $1'\%'60}'` && { /usr/local/monitoring/sendstats ; } >/dev/null 2>&1

The latter has a static delay per host (derived from the checksum of /etc/hosts), but it is still individual to each system.
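Another possibility, sketched below, is to keep the crontab entirely free of % by moving the delay into a small wrapper script. (The name sendstats-delayed and the ksh shebang are just placeholders; any shell that provides $RANDOM will do.)

39 15 * * * /usr/local/monitoring/sendstats-delayed >/dev/null 2>&1

with /usr/local/monitoring/sendstats-delayed containing:

#!/bin/ksh
# sleep a pseudo-random 0-59 seconds, then run the real job
sleep $((RANDOM % 60))
exec /usr/local/monitoring/sendstats

Since the modulo now lives inside a script, cron never sees a % character.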

Do you have other solutions?


I am not sure what problem this solves; it raises a number of issues:

  • A big Unix environment
  • Submitting jobs to thousands of systems
  • Load peaks for shared resources

For example, depending on the task and the target systems, this could simply be a matter of updating the cron entries on the end systems, rather than scheduling tasks from the cron of a single system or small set of systems. Doing the latter for a thousand or so systems seems problematic.

I would not do this for a thousand systems; rather, I would either distribute crontabs to the remote systems or look for an enterprise scheduling solution (open-source software, or paid-for products such as Control-M).

Enterprise schedulers provide such features as distribution, redundancy and central reporting.

Just a thought.

Maybe you misunderstood my post?
I did not mean distributing jobs to several systems.
Instead I mean running the same cron job on several systems at the same time.
I have rephrased my post.