[rrd-users] Re: retroactively graphing data from logfile (newbie)

Alex van den Bogaerdt alex at ergens.op.het.net
Tue May 24 19:41:34 MEST 2005


On Tue, May 24, 2005 at 01:54:27PM +0200, Ryan Tracey wrote:

> 2005-05-03-12 102518 16 119 278 406 27 9 1145 0 
> 2005-05-03-13 127723 22 273 426 488 29 2 1264 0 
> 
> Where the first column is the date and hour.   The columns that follow
> contain the occurrences of a given type in that hour.
> 
> I created the following rrd. 
> 
> rrdtool create hugelog.rrd \
> --start 1114833600 \

Sat Apr 30 06:00:00 2005 METDST

> --step 3600 \

Each hour.  So far, so good.  Updates after Apr 30 will be allowed.

> DS:sync_error:GAUGE:3600:0:100000 \
> DS:helo_ip:GAUGE:3600:0:100000 \
> DS:helo_strange:GAUGE:3600:0:100000 \
> DS:sender_verify:GAUGE:3600:0:100000 \
> DS:greylist:GAUGE:3600:0:100000 \
> DS:malform_mime:GAUGE:3600:0:100000 \
> DS:unaccept_attach:GAUGE:3600:0:100000 \
> DS:zipfile:GAUGE:3600:0:100000 \
> DS:malware:GAUGE:3600:0:100000 \

Accept positive rates up to 100000.  Guess that's OK as well.
Heartbeat is 3600:  you want updates at most 3600 seconds apart.

> RRA:AVERAGE:0.5:1:24

One "step" per row, 24 rows.  That's not huge, is it?  You have
asked for a database storing exactly one day (24 rows, one hour each).
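For illustration only: the number of rows is the retention period divided by the step.  If the goal were, say, a whole year of hourly averages (the retention period here is my assumption, not something stated in the post), the arithmetic looks like this:

```shell
# Rows needed in an RRA = retention period / step.
STEP=3600                          # seconds per step (from --step 3600)
RETENTION=$(( 365 * 24 * 3600 ))   # one year in seconds (assumed goal)
ROWS=$(( RETENTION / STEP ))
echo "$ROWS"                       # 8760
```

The create line would then end with something like `RRA:AVERAGE:0.5:1:8760` instead of `RRA:AVERAGE:0.5:1:24`.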

> rrdtool update hugelog.rrd -t \
> sync_error:\
> helo_ip:\
> helo_strange:\
> sender_verify:\
> greylist:\
> malform_mime:\
> unaccept_attach:\
> zipfile:\
> malware \

I don't think the "-t" option through "malware" is necessary.  Just make
sure you supply the values in the same order as the DS definitions, and
skip that whole part.
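Without the template the update is just the timestamp and the values, colon-separated, in DS-definition order.  Using the exact values from your example line, that would be:

```
rrdtool update hugelog.rrd 1115110805:67846:4:82:157:318:12:1:426:0
```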

> 1115110805:67846:4:82:157:318:12:1:426:0

I guess you mean "1115110805" is variable __AND__ it is a whole
multiple of 3600 ?

You want to update every 3600 seconds, or sooner.  If one timestamp
is exactly on the hour, and the next timestamp is 5 seconds late,
you'll lose an update.
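As it happens, that example timestamp is itself 5 seconds past the hour.  One way to avoid the problem (my suggestion, not something rrdtool requires) is to truncate each timestamp to the step boundary before updating:

```shell
TS=1115110805
STEP=3600
echo $(( TS % STEP ))        # 5 -> five seconds past the hour
ALIGNED=$(( TS - TS % STEP ))
echo "$ALIGNED"              # 1115110800, exactly on the hour
```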

> Firstly, what am I doing wrong?  I will attach the rrd (hugelog.rrd:
> not sure if the list will accept, though.)

Look at the size of "hugelog":

> Secondly, am I just introducing unneccessary complexity by
> preprocessing the original log file and creating a per-hour file?

Preprocessing?  RRDtool could probably do it for you.  However, you
didn't specify what you did, so I can only guess.
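For what it's worth, here is a sketch of going straight from the log format shown above to update commands, with no intermediate per-hour file.  It assumes GNU date, that the first field is "YYYY-MM-DD-HH", and that the stamps are UTC (drop -u for local time); the loop body and variable names are mine, not from the post:

```shell
# Turn lines like "2005-05-03-12 102518 16 119 278 406 27 9 1145 0"
# into "rrdtool update" commands.  Assumes GNU date; -u treats the
# stamp as UTC -- drop it to use local time instead.
while read stamp v1 v2 v3 v4 v5 v6 v7 v8 v9; do
  # "2005-05-03-12" -> "2005-05-03 12:00"
  when="${stamp%-*} ${stamp##*-}:00"
  epoch=$(date -u -d "$when" +%s)
  echo "rrdtool update hugelog.rrd ${epoch}:${v1}:${v2}:${v3}:${v4}:${v5}:${v6}:${v7}:${v8}:${v9}"
done <<'EOF'
2005-05-03-12 102518 16 119 278 406 27 9 1145 0
EOF
```

Piping the echoed lines into a shell (or calling rrdtool directly in the loop) replays the whole log in one pass.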

> -- Attached file removed by Ecartis and put at URL below --
> -- Type: application/octet-stream
> -- Size: 4k (4764 bytes)
> -- URL : http://lists.ee.ethz.ch/p/hugelog.rrd

Only 4764 bytes.  That isn't huge, that isn't large, that isn't
even average size.  It is _very_ small for an RRDtool file.

HTH
Alex

--
Unsubscribe mailto:rrd-users-request at list.ee.ethz.ch?subject=unsubscribe
Help        mailto:rrd-users-request at list.ee.ethz.ch?subject=help
Archive     http://lists.ee.ethz.ch/rrd-users
WebAdmin    http://lists.ee.ethz.ch/lsg2.cgi
