[rrd-users] trying to understand the relationship between source data, what's in rrd and what gets plotted
Simon Hobson
linux at thehobsons.co.uk
Sat Jul 21 10:08:44 CEST 2007
Mark Seger wrote:
>The thing that's interesting about this whole situation is that on one
>level rrd appears to draw a cleaner graph and the gnuplot one looks a
>little fuzzier, but I also think the gnuplot provides valuable
>information that gets lost, and probably missed with rrd. If my
>examples were disk performance numbers rrd could have led someone to the
>conclusion that everything was running just fine at a load of 20 while
>gnuplot shows there's really a range from 0 to 20 and things are not
>fine. If you zoom into the rrd data you definitely can see the
>details of the drop off, but my fear is how many people would bother.
>They would see the day long data and think everything is fine.
On the other hand, I can't remember when it was, but I certainly
learned about the basic arithmetic functions of min, max, and average
at school. RRD is simply a tool, as is GNUplot.
What I would say is that if you want to plot every datapoint, as
collected, with no normalisation or consolidation, then rrd is
probably not the right tool - you should plot with GNUplot instead.
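For example (just a sketch - "samples.dat" is a made-up file holding
two columns, timestamp and value), plotting every raw sample with
gnuplot needs nothing more than:

    gnuplot -e "set terminal png; set output 'raw.png'; plot 'samples.dat' using 1:2 with lines"

No averaging happens, so every spike and dip in the source data shows
up in the picture.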
What rrd does do (very well) is let you collect detailed numbers and
balance the storage and processing cost against the need to keep data
for a long time - eg keeping every 5-second sample for a few hours,
but dropping to a coarser resolution so that a whole year remains
practical to store.
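As a rough sketch of how that trade-off is expressed (the filename,
DS name and row counts here are invented for illustration), you
define one RRA at full resolution for the short term and coarser
MIN/MAX/AVERAGE RRAs for the long term:

    rrdtool create disk.rrd --step 5 \
        DS:load:GAUGE:10:0:U \
        RRA:AVERAGE:0.5:1:2880 \
        RRA:AVERAGE:0.5:720:8760 \
        RRA:MIN:0.5:720:8760 \
        RRA:MAX:0.5:720:8760

That keeps every 5-second sample for 4 hours (2880 rows), and hourly
averages, minima and maxima for a year (8760 rows). Because the MIN
and MAX are stored alongside the AVERAGE, the 0-to-20 spread Mark
describes need not be lost in a day- or year-long graph - plot all
three:

    rrdtool graph load.png --start -1y \
        DEF:avg=disk.rrd:load:AVERAGE \
        DEF:min=disk.rrd:load:MIN \
        DEF:max=disk.rrd:load:MAX \
        LINE1:max#ff0000:"max" \
        LINE1:avg#0000ff:"avg" \
        LINE1:min#00aa00:"min"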