[rrd-users] rrdtool memory usage and the Raspberry Pi
jared.henley at pelena.com.au
Fri Oct 16 06:39:37 CEST 2015
I've been working on a logging application that uses rrdtool. It works
brilliantly on my PC, but not so well on the Raspberry Pi Model B+.
Before I go there, though, there's something interesting I noticed while
working on the PC.
I'm creating a database with the following command (via the rrdtool
Python bindings):
<snip - there are 29 definitions all up, all the same>
This creates a 386MB file.
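For concreteness, a create call of roughly that shape might look like
the sketch below. The data-source names, step, and row count are
hypothetical stand-ins, not the snipped originals; the sketch only
builds the argument list that would be handed to rrdtool.create():

```python
# Hypothetical sketch of a create call with 29 identical definitions.
# All names and numbers here are illustrative assumptions.
N_SOURCES = 29
STEP = 5            # base step in seconds (assumed)
ROWS = 60000        # 5000 minutes at a 5-second step

args = ["mydata.rrd", "--step", str(STEP)]
# 29 identical data-source definitions, as in the snipped command
args += [f"DS:sensor{i}:GAUGE:{2 * STEP}:U:U" for i in range(N_SOURCES)]
# one round-robin archive keeping every primary data point
args.append(f"RRA:AVERAGE:0.5:1:{ROWS}")

# With the python bindings installed this would then be:
#   import rrdtool
#   rrdtool.create(*args)
print(args[3])      # first DS definition
print(args[-1])     # the RRA definition
```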
I didn't think too hard about it until I exported a graph to a CSV
file. Surely the CSV would be less space-efficient than the binary rrd
file, right? But the generated CSV turns out to be only 15MB.
rrdtool seems to use 64-bit values for everything, so I figure the rrd
file above should use:
8 bytes per entry * (29 RRAs + 1 timestamp) * 60,000 rows in the
round-robin archive (5000 minutes / 5 seconds) = 14.4MB. I'm confused:
why is the rrd file almost 30 times bigger than I'd expect? I did
experiment with a step size of 5 seconds, but the created file was the
same size. Still, I can live with largish files.
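The back-of-envelope arithmetic works out like this (using the numbers
from the paragraph above):

```python
# Size estimate: 8 bytes per stored value, 29 data sources plus one
# timestamp slot, 60,000 rows (5000 minutes at a 5-second step).
BYTES_PER_VALUE = 8
SLOTS = 29 + 1
ROWS = 5000 * 60 // 5              # 60,000 rows

expected = BYTES_PER_VALUE * SLOTS * ROWS
print(expected / 1e6, "MB")        # 14.4 MB

actual = 386e6                     # observed file size
print(round(actual / expected, 1), "x larger")   # roughly 27x
```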
However, creating the databases on the Raspberry Pi was a dismal
failure. Memory usage climbed until the process crashed. I assume the
out-of-memory process killer did it. Since the Raspberry Pi only has
about 384MB of memory free after booting up, it is fairly memory
constrained. Is rrdtool creating the entire file in memory before
writing it out to disk?
So I copied some rrd files from the PC to the Raspberry Pi. I wasn't
surprised that they didn't work - presumably an endianness issue.
However, the rrd graph generation code was able to load the file and
complain about the badness in it, so there's at least enough RAM to
open an existing database.
I wonder if I am going about this the wrong way? In the last hour I've
seen references to people having hundreds of rrd files. Is it
recommended to split your data up into lots of small chunks? Are there
other recommendations about how to use rrdtool in the most efficient
way?
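If splitting is the way to go, I imagine it would look something like
the sketch below: one small single-DS file per sensor instead of one
big 29-DS database. The names, step, and row count are hypothetical;
each argument list would be passed to rrdtool.create():

```python
# Sketch of a one-file-per-source layout: 29 small single-DS databases
# instead of one 29-DS file. All names and numbers are assumptions.
STEP, ROWS = 5, 60000

def create_args(name):
    """Build the rrdtool.create() arguments for one single-DS file."""
    return [
        f"{name}.rrd", "--step", str(STEP),
        f"DS:value:GAUGE:{2 * STEP}:U:U",
        f"RRA:AVERAGE:0.5:1:{ROWS}",
    ]

all_args = [create_args(f"sensor{i:02d}") for i in range(29)]
# With the bindings installed: for a in all_args: rrdtool.create(*a)
print(len(all_args), all_args[0][0])
```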