[rrd-users] Bits to Bytes.. trying to figure out what's happening

Westlake, Simon simon.westlake at twcable.com
Mon Mar 19 14:07:03 CET 2007

I run MRTG to graph various statistics on routers, and I recently
switched from 14all.cgi to php4rrdtool for generating the graphs,
because I needed some features that I could only get by generating
the graphs myself.

So, I'm now running into a couple of problems with the graphs that I'm
trying to understand. Yesterday I ran some tests with a packet generator
over a gigabit link I'm graphing and got results I didn't expect. On
most of the graphs I generate, typical throughput is between five and
fifty megabits per second, and those graphs appear to be displayed
accurately in bits. I'm using the following code to generate the graphs
(some HTML etc. clipped):

$opts = array( "--start", "$date3", "--end", "$date4",  "-v Throughput",
"-t Total Data Throughput",
          "AREA:bitsin#009900:Incoming Traffic",
          "LINE2:bitsout#0000ff:Outgoing Traffic",
          "PRINT:bitsin:MAX:Max Inbound\: %1.1lf%s",
          "PRINT:bitsout:MAX:Max Outbound\: %1.1lf%s",
          "PRINT:bitsin:LAST:Current Inbound\: %1.1lf%s",
          "PRINT:bitsout:LAST:Current Outbound\: %1.1lf%s",
          "PRINT:bitsin:AVERAGE:Average Inbound\: %1.1lf%s",
          "PRINT:bitsout:AVERAGE:Average Outbound\: %1.1lf%s"
);

So, as you can see, I'm taking ifInOctets/ifOutOctets and multiplying
by 8 to get bits, then drawing an area and a line for the input and
output.
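The DEF/CDEF lines that define bitsin/bitsout were clipped above; for an MRTG-written RRD they would presumably look something like the following (the RRD path and DS names here are hypothetical placeholders, not the real ones):

```php
// Hypothetical DEF/CDEF lines -- the actual RRD path and DS names were clipped.
// MRTG stores ifInOctets/ifOutOctets as bytes per second, so the CDEFs
// must multiply by 8 to get bits per second before anything is drawn.
"DEF:inoctets=/path/to/router.rrd:ds0:AVERAGE",
"DEF:outoctets=/path/to/router.rrd:ds1:AVERAGE",
"CDEF:bitsin=inoctets,8,*",
"CDEF:bitsout=outoctets,8,*",
```

If the *8 step is missing or applied to the wrong variable, the graph will silently plot bytes per second while being labeled as bits.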

In my gig test yesterday I was pushing between 930 and 970 megabits per
second from the packet generator, and the 5-minute input/output rate on
the switch interface confirmed this was accurate. However, my graphs
showed anywhere between 100 and 115 "megs". At first I wondered if it
was some kind of 'MaxBytes' problem in MRTG, so I raised that value to
a very large number, but it didn't help. I then realized that 115
megabytes per second is roughly 964 megabits per second.
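The arithmetic behind that realization checks out if the 115 figure is read as binary megabytes. Here is the conversion spelled out (Python used only as a calculator):

```python
# If the graph is really plotting bytes/sec, a reading of ~115 "megs"
# (binary megabytes, i.e. MiB/s) corresponds to the ~964 Mbit/s seen
# on the switch interface.
bytes_per_sec = 115 * 2**20        # 115 MiB/s
bits_per_sec = bytes_per_sec * 8   # octets -> bits
print(bits_per_sec / 1e6)          # -> 964.68992, i.e. ~964.7 Mbit/s
```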

So, I'm now assuming that the commands I'm using to generate my graphs
are, above some value threshold, converting the values to bytes. I've
looked through the RRD documentation to see whether I can perform this
scaling myself, but I can't see a simple way to do it without an
if/else chain (if bitsin > 1000, divide by 1000; if it's still > 1000,
divide by 1000 again, and so on).
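For what it's worth, rrdtool's %s format specifier in PRINT/GPRINT already performs exactly this kind of SI autoscaling, but the repeated divide-by-1000 idea described above can be sketched as follows (a minimal illustration in Python with a made-up helper name, not rrdtool's own code):

```python
def si_scale(value, base=1000.0):
    """Repeatedly divide by 1000 until the value drops below 1000,
    tracking which SI prefix applies -- the if/else chain as a loop."""
    prefixes = ["", "k", "M", "G", "T"]
    i = 0
    while value >= base and i < len(prefixes) - 1:
        value /= base
        i += 1
    return value, prefixes[i]

value, prefix = si_scale(964_689_920)  # ~964.7 million bits/sec
print(f"{value:.1f} {prefix}bit/s")    # -> "964.7 Mbit/s"
```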

Do you think my hypothesis about what is happening is correct, and if
so, can anyone suggest an easy way to rectify this problem?

Simon Westlake
Time Warner Cable Business Class
Network Engineer
Ph: 414.908.4791 | Cell: 414.688.7956

