You can have non-interactive processes that run as
cron jobs. However, the
high resource usage (HRU) restriction will still apply. If you're worried about a runaway PHP script, you can use
PHP's set_time_limit(). More generally, you can force any command to exit after a fixed time with:
Code:
#!/bin/bash
# Time limit in seconds; an optional first argument overrides the default.
timeout=30
if [ $# -gt 0 ]; then
    timeout=$1
fi
# Run the command in the background, logging stdout and stderr.
/path/to/binary arg0 arg1 ... argN >$HOME/log/job.log 2>&1 &
# After $timeout seconds, kill the background job if it's still running.
sleep $timeout && kill $! 2>/dev/null
Replace "/path/to/binary ... argN" with the command you want to run, save the script as a file in ~/bin, and set the cron job command to that file.
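As a sketch, supposing you saved the wrapper above as ~/bin/job_wrapper (the name is just an illustration), the crontab entry to run it at the top of every hour would look something like this (the command field is passed to /bin/sh, so $HOME is expanded at run time):

```shell
# min  hour  dom  mon  dow  command
  0    *     *    *    *    $HOME/bin/job_wrapper
```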
Even more generally, you can use the following script (saved as e.g. "~/bin/timelimit") to run an arbitrary command for a limited length of time. Make sure you escape any shell metacharacters when calling the script so they don't get interpreted before timelimit gets a chance to execute the command.
Code:
#!/bin/bash
# Default time limit in seconds.
timeout=30
# If the first argument is numeric, use it as the time limit.
# (The stderr redirect belongs on the if, not inside the brackets,
# so the "integer expression expected" noise is suppressed.)
if [ "$1" -ne 0 ] 2>/dev/null ; then
    timeout=$1
    shift
fi
# Run the remaining arguments as a command in the background.
eval "$*" &
# After $timeout seconds, kill it if it's still running.
sleep $timeout && kill $! 2>/dev/null
Example usage:
Code:
$HOME/bin/timelimit 60 cut -f 1 -d ' ' $HOME/data/activity \| sort -n \| uniq -c \>$HOME/logs/visit_counts
#or:
$HOME/bin/timelimit 60 "cut -f 1 -d ' ' $HOME/data/activity | sort -n | uniq -c >$HOME/logs/visit_counts"
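Incidentally, if the host has GNU coreutils installed (worth checking before you rely on it), the stock timeout(1) command does the same job without a wrapper script, and it also returns as soon as the command finishes rather than always sleeping the full limit. It exits with the command's own status, or 124 if the limit was hit:

```shell
# Finishes well inside the limit: exit status is the command's own (0 here).
timeout 60 sleep 1
echo "first exit status: $?"

# Limit hit: timeout kills the command and exits 124.
timeout 1 sleep 60
echo "second exit status: $?"
```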
For an example of how you can run multiple, specific commands with a timeout for all, read
Alarms, Timeouts and Parallel Processes in Bash