From: Cron Daemon <root at minion.xxxx.com>
Date: Mon Jul 23 06:03:02 MEST 2001
Subject: Cron <root at minion> run-parts /etc/cron.minute
/etc/cron.minute/update.wlo:
ERROR: expected 3 data source readings (got 4) from N:1:1.48:::...
My script, which runs once every 60 seconds:
[root at minion cron.minute]# cat update.wlo
#!/bin/sh
master=`rsh wlo /bin/totinf`    # returns 3 lines of data
#echo "$master"
ld=`echo "$master" | gawk 'NR == 1'`
#echo "$ld"
users=`echo "$master" | gawk 'NR == 2'`
#echo "$users"
tout=`echo "$master" | gawk 'NR == 3'`
#echo "$tout"
rrdtool update /usr/local/rrdwork/wlo.rrd "N:$ld:$users:$tout"
#echo $ld $users $tout
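The "got 4" error suggests rsh occasionally returns something other than three clean single-value lines -- an extra line, or a value with an embedded colon, which rrdtool would count as an extra reading. A sketch of a guard that could be added to the script; `build_update` is a hypothetical helper of mine, not part of the original script or of rrdtool:

```shell
#!/bin/sh
# Hypothetical guard: validate the rsh reply before handing it to
# rrdtool, so a malformed reply fails loudly instead of producing a
# bad "N:..." template.

# build_update READINGS -> prints "N:ld:users:tout", or fails unless
# the input is exactly three one-field, colon-free lines.
build_update() {
    printf '%s\n' "$1" | awk '
        NF != 1            { bad = 1 }   # each line: one value only
        index($1, ":")     { bad = 1 }   # embedded colon = extra reading
        { v[NR] = $1 }
        END {
            if (NR != 3 || bad) exit 1   # wrong shape -> error
            printf "N:%s:%s:%s\n", v[1], v[2], v[3]
        }'
}
```

In the cron script this could replace the bare update call ("wlo" and the rrd path are from the original): `arg=\`build_update "$master"\` || exit 1; rrdtool update /usr/local/rrdwork/wlo.rrd "$arg"`.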
----------------------------------------
My rrd is created as follows:
rrdtool create --step 60 wlo.rrd \
DS:load:GAUGE:600:0:100 \
DS:users:GAUGE:120:0:120 \
DS:tout:GAUGE:120:0:200 \
RRA:AVERAGE:0.5:1:600 \
RRA:AVERAGE:0.5:6:700 \
RRA:AVERAGE:0.5:24:775 \
RRA:AVERAGE:0.5:288:797 \
RRA:MAX:0.5:1:600 \
RRA:MAX:0.5:6:700 \
RRA:MAX:0.5:24:775 \
RRA:MAX:0.5:288:797
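For reference, each RRA's retention at --step 60 works out to 60 * steps-per-row * rows seconds. A small sketch of that arithmetic (the `rra_days` helper is mine, not part of rrdtool), applied to the archives above:

```shell
#!/bin/sh
# Retention in days of one RRA at --step 60:
#   seconds = 60 * steps_per_row * rows
rra_days() {
    awk -v p="$1" -v r="$2" 'BEGIN { printf "%.1f\n", 60 * p * r / 86400 }'
}

rra_days 1   600    # 0.4 days (10 hours)
rra_days 6   700    # 2.9 days
rra_days 24  775    # 12.9 days
rra_days 288 797    # 159.4 days -- the longest archive falls short of a year
```

So as defined, the longest archive keeps about five months; covering a full year at daily resolution would need something on the order of steps-per-row 1440 with roughly 370 rows.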
My rrd isn't optimized very well. I need enough samples to cover 1
year, 6 months, 1 month, 1 week and 1 hour without the files becoming too
large. I had trouble following the examples for the archive consolidation
functions and working out where the x-files factor comes from. I will be
storing 40 databases and must not exceed 100 MB for a year of data.
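As a rough size check (assuming about 8 bytes per stored value and ignoring the small fixed header -- my approximation, not a figure from the rrdtool documentation), the current layout is nowhere near the 100 MB ceiling:

```shell
#!/bin/sh
# Rough size of one wlo.rrd: sum the four RRA row counts, double for
# AVERAGE+MAX, times 3 data sources, times ~8 bytes per value.
rows=$((600 + 700 + 775 + 797))
bytes=$((rows * 2 * 3 * 8))
echo "$bytes bytes per rrd"          # 137856 -> ~135 KB
echo "$((bytes * 40)) bytes for 40"  # ~5.5 MB, far below 100 MB
```

So even adding much longer archives to reach a full year of data would leave plenty of headroom across 40 databases.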
I look forward to your responses and thank you in advance for your
help. =)
Sincerely,
Eric Collins