[rrd-users] is this my fault?

Adam Glass rrdtool at adam.clarity.net
Thu Feb 6 02:47:30 MET 2003


I'm seeing some strange behavior from rrdtool that I think the program
ought to handle, but which might actually just be my fault.  You guys
can be the judge...

I wrote a program to retrieve byte counters from firewall rulesets and
graph them by protocol.  The byte counts are stored as COUNTER data
sources, since the counters are neither reset when read nor reported
as a value covering only the previous few minutes.
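
For reference, here's roughly the kind of create command I'm using
(the file name, DS name, heartbeat, and RRA layout are illustrative,
not my exact setup):

  rrdtool create fw-bytes.rrd --step 300 \
    DS:bytes:COUNTER:600:0:U \
    RRA:AVERAGE:0.5:1:600 \
    RRA:AVERAGE:0.5:6:700

Note the unbounded maximum (the trailing U); that's what the first
workaround listed below would change.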

The firewalls have been running for a while, so the byte counts for
each firewall rule are fairly large.  What seems to be happening,
however, is that the first update is compared against zero, meaning
that the delta for the first time period is astronomical.  The deltas
between successive datapoints are normal.  Here's a hypothetical
example of what seems to be happening:

  first read:  byte count is 1000000  delta from last: 1000000
  second read: byte count is 1000050  delta from last: 50
  third read:  byte count is 1000200  delta from last: 150
  etc
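
Assuming a 300-second step, that first reading graphs as roughly
1000000/300, or about 3333 bytes/sec, while the later deltas work
out to well under one byte/sec, so an auto-scaled Y axis ends up
completely dominated by the first point.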

This makes auto-scaled graphs all but unreadable until the first
astronomically huge value has disappeared from the time window being
graphed.

Here's an example of a graph where the data has scrolled out of view,
and where things look the way they should:
  http://www.clarity.net/adam/images/temp/daily.png

Here's an example of a graph where this initial huge value is making
it impossible to see the subsequent nominal data:
  http://www.clarity.net/adam/images/temp/weekly.png

And lest you think that nothing at all is being graphed, here's a
protocol that generated enough traffic in subsequent time periods
for some data to actually appear (look closely):
  http://www.clarity.net/adam/images/temp/weekly-http.png

So, is this my fault?  Shouldn't rrdtool realize that it doesn't
have any previous data, and not calculate the rate of change by
comparing the initial data to zero?

There are several things I could do to prevent this, all of which
are fraught with unpleasant consequences:
  - set a maximum value when creating the DS
  - set a fixed (and small) range of values in the Y axis
  - use a CDEF to limit the values when graphing (sketch after
    this list)
  - write a wrapper script to calculate the rates of change and
    only output a non-zero rate of change when it has a valid
    previous value against which to calculate a delta
  - reset the firewall rule byte counts when creating the database
  - be patient and wait for the large values to leave the
    graphing window  :)
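
For the CDEF option, I'm picturing something like the following (the
output file, DS name, and the 125000000 ceiling are made up, and it
assumes the LIMIT RPN operator, which turns out-of-range values into
unknowns):

  rrdtool graph weekly.png --start -1w \
    DEF:raw=fw-bytes.rrd:bytes:AVERAGE \
    CDEF:capped=raw,0,125000000,LIMIT \
    LINE1:capped#0000FF:"bytes/sec"

But picking a sane ceiling for every protocol is exactly the kind of
guesswork I'd rather avoid.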

None of those are good options.  Am I doing something wrong?

Adam
