[rrd-users] Simple DEF Arithmetic
Johan Elmerfjord
jelmerfj at adobe.com
Tue Aug 6 21:39:50 CEST 2013
Hi Chris,
On Tue, 2013-08-06 at 08:20 -0700, Chris Mason wrote:
> Hi Johan,
>
> But when looking at a year, you may have consolidated the
> individual numbers into larger buckets - where the max-values
> are kept.
> Let's create an example where two values are
> "joined" (the timeslot is increased to contain 2 values), so we
> reduce the data to 5 buckets.
> Most people have maybe 2 weeks of data with full resolution,
> and then consolidate this to larger buckets (from minutes to
> hours) - and then after a few months of them - go up to some
> even larger buckets (hours to days).
>
> With the same max, we would then have:
>
> A2: 3 5 6 4 4
> B2: 6 8 4 3 1
> And our new max-array of the sum would be:
> AB2: 9 13 10 7 5
>
> Where the max-value is 13 - and not 12 as above.
>
> I see what you are saying, but this would change the MAX of A2 and the
> MAX of B2 between different bucket sizes.
> But if A and B both return the same MAX value for a 1-year-resolution
> graph and a 1-month-resolution graph, doesn't this imply they aren't
> being reduced?
>
> i.e.
>
> 1M MAX of A: 749.807 M
> 1Y MAX of A: 749.807 M
>
> 1M MAX of B: 1744.822 M
> 1Y MAX of B: 1744.822 M
Even if the data is consolidated you will still get the max values, as
your RRDs are set up to keep these.
But looking at a wider scope you may see a difference in the sum of the
maxes - I think... :-)
Another example, with single samples:
A: 1 2 3 3 2 1
B: 3 2 1 1 2 3
At each of the timeslots above, the sum of the A and B values is
always 4.
But when all 6 values are consolidated to a single value using MAX, you
end up with 3 for A and 3 for B.
The sum of these is 6, not 4.
So if larger buckets are created and the sum is calculated from them,
you can end up with a higher number than ever actually occurred.
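This is easy to check in a few lines of Python (a sketch, using the
sample series from the example above):

```python
# Johan's single-sample example: at every timeslot A + B == 4.
A = [1, 2, 3, 3, 2, 1]
B = [3, 2, 1, 1, 2, 3]

# True peak of the combined traffic: sum first, then take the MAX.
true_peak = max(a + b for a, b in zip(A, B))   # 4

# Consolidate each series to a single MAX value first, then add.
consolidated_peak = max(A) + max(B)            # 3 + 3 = 6

print(true_peak, consolidated_peak)            # 4 6
```

In general max(A + B) <= max(A) + max(B), so summing separately
consolidated MAX series can only overstate the real combined peak,
never understate it.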
For network bandwidth it's normal to look at the 95th percentile.
At work we store 5 years of 5-minute data without consolidation.
RRDtool does, however, create larger buckets anyway, especially when the
output is a graph, as all 525600 samples can't fit in a graph when
looking at 5 years.
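The bucket consolidation itself is easy to model. A minimal sketch of
MAX consolidation (the function is my own illustration, not an RRDtool
API), reusing the 6-sample series from above:

```python
def consolidate_max(samples, bucket):
    """Join `bucket` consecutive samples into one, keeping the MAX."""
    return [max(samples[i:i + bucket]) for i in range(0, len(samples), bucket)]

A = [1, 2, 3, 3, 2, 1]
B = [3, 2, 1, 1, 2, 3]

# Halving the resolution (2 samples per bucket):
print(consolidate_max(A, 2))  # [2, 3, 2]
print(consolidate_max(B, 2))  # [3, 1, 3]

# Summing the consolidated series already exceeds the real peak of 4:
print([a + b for a, b in zip(consolidate_max(A, 2),
                             consolidate_max(B, 2))])  # [5, 4, 5]
```

The coarser the buckets, the worse the overstatement can get, which is
why the 1-month and 1-year sums can disagree even when each individual
series reports the same MAX.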
When exporting data, I think it's possible to control this
bucket consolidation.
I'm not sure how to tackle your problem, but I would still think twice
before using MAX.
/Johan
>
> I'm not sure what you want to accomplish.
> If you are trying to look at some bandwidth numbers, you should
> probably not use MAX anyway, as even a very short peak would
> skew your whole yearly data.
>
> I am measuring link utilisation and we graph MAX as opposed to AVERAGE
> as this gives us more accurate information on peak utilisation. They
> are generally very high capacity links which don't produce bursty
> graphs.
>
> Regards,
> Chris