[rrd-users] Huge 'outlier' in a graph rendering rest of graph unreadable
David Ball
davidtball at gmail.com
Thu Apr 23 17:21:03 CEST 2009
Spent a while with the docs and list archives trying to sort this
one out, but didn't find what I was looking for. I have to think this
has happened to someone else...
I have several routers which write interface bandwidth counter data
to a flat file every 5 minutes. At the end of the month I FTP the data
files to a server, read through the data file and dump it into an .rrd
and ultimately generate a graph. I'm not interested in exploring
better methods of gathering this data right now (i.e. SNMP), so let's
leave that alone for now (plans are already underway).
A couple of weeks ago, a process running on one of the routers
restarted itself, and resulted in the interface counter values being
reset to 0. As such, the counter value went from, say, 35000000000 in
one reading to 16000 in the next. While the parsing script can detect
these types of things and 'discard' the sample when calculating the
total bytes in/out on the interface, the value was still graphed,
resulting in a spike to 125Gbps in the graph (obviously impossible).
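One guard worth mentioning (an assumption, since my actual create command
isn't shown here) would be giving each data source a sane maximum, so that
any computed rate above it is stored as UNKNOWN instead of being graphed.
A rough sketch with hypothetical DS names and a ~1 Gbps cap:

```shell
# Hypothetical rrdtool create: COUNTER data sources capped at
# 125000000 bytes/sec (~1 Gbps). Rates above the max (or below the
# min of 0) are stored as UNKNOWN rather than plotted as a spike.
rrdtool create traffic.rrd --step 300 \
  DS:bytes_in:COUNTER:600:0:125000000 \
  DS:bytes_out:COUNTER:600:0:125000000 \
  RRA:AVERAGE:0.5:1:8928
```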
Sound familiar to anyone? Any guidance you can provide? I figured
simply not adding the sample to the .rrd might help, but it'll then
just average the NEXT value, which may only be 32000.
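For context, the detect-and-discard logic I described might instead feed
RRDtool's 'U' (unknown) value when the counter goes backwards, so the
interval is marked UNKNOWN rather than the rate being computed across the
gap. A minimal sketch, assuming the rrdtool CLI is on PATH and with
`sample_to_feed` as a hypothetical helper:

```python
import subprocess

def sample_to_feed(prev, curr):
    """Return the value string to feed to rrdtool for this sample.

    If the counter went backwards (e.g. the router process restarted
    and reset the counter to near 0), return 'U' so RRDtool marks the
    interval UNKNOWN instead of computing an impossible rate.
    """
    if prev is not None and curr < prev:
        return "U"
    return str(curr)

def update_rrd(rrd_path, timestamp, prev, curr):
    """Feed one sample to the .rrd, discarding counter resets as UNKNOWN."""
    value = sample_to_feed(prev, curr)
    # e.g.: rrdtool update traffic.rrd 1240500000:U
    subprocess.run(["rrdtool", "update", rrd_path, f"{timestamp}:{value}"],
                   check=True)
```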
thanks in advance,
David