[rrd-users] rrd_restore memory consumption (Re: rrdtool backed mrtg using massive amounts of memory)
tobi at oetiker.ch
Wed May 20 06:25:10 CEST 2009
the problem is that in 1.3 we use libxml to read in the
complete file. The same happens in 1.2, but there we use our own
code to do it ...
In order to fix this behaviour, I would have to rewrite rrd_restore
entirely to use an incremental parsing approach (xmlReader) ...
for now, your best bet with large rrd files is to use 1.2
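[Editor's note: the incremental approach mentioned above can be sketched as follows. This is not rrdtool's code (which is C and would use libxml's xmlReader API); it is a minimal Python illustration of the same idea using xml.etree.ElementTree.iterparse, which processes elements as they stream past instead of building the whole document tree in memory.]

```python
import io
import xml.etree.ElementTree as ET

def count_values(stream):
    """Count <v> elements incrementally, without building the full tree."""
    count = 0
    # "end" events fire as each element is fully parsed, streaming-style
    for event, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag == "v":
            count += 1
        elem.clear()  # drop the element's children so memory stays bounded
    return count

# toy stand-in for a large rrd dump file
xml = "<rrd><rra>" + "<v>0.0</v>" * 1000 + "</rra></rrd>"
print(count_values(io.StringIO(xml)))  # prints 1000
```

With a DOM-style parse, the peak memory is proportional to the whole file; with this streaming style, it is proportional to a single element, which is why a ~1 GB dump need not exhaust VM.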
Yesterday Carson Gaspar wrote:
> Tobias Oetiker wrote:
> > Yesterday Carson Gaspar wrote:
> > > Tobias Oetiker wrote:
> > >
> > > > the restore issues may come from the switch to libxml for reading the
> > > > rrd dump files ... does this memory issue occur with any file you
> > > > try to restore or only with really big ones ?
> > > The memory bloat (compared to 1.2.x) is for all files, but the degree
> > > varies by input file. I haven't determined what the difference is, but the
> > > ones that completely exhaust VM are definitely among the ones with the
> > > most samples.
> > what size are you talking about ?
> One failing example:
> -rw-r--r-- 1 gaspac fir1 966M May 19 14:28 0.xml
> 1.2.x uses 1,100,399K (according to "timex -p -m")
> 1.3.8 spews thousands of errors:
> 0.xml:81729: error: xmlSAX2Characters: out of memory
> v><v> 0.0000000000e+00 </v><v> 0.0000000000e+00 </v><v> 0.0000000000e+00
> A 32-bit binary can't deal at all. I tried building a 64-bit binary and (as I
> recall) it got to over 15GB of VM before dying. (I'd give more exact numbers,
> but rebuilding the huge numbers of rrdtool dependencies 64-bit takes ages).
Tobi Oetiker, OETIKER+PARTNER AG, Aarweg 15 CH-4600 Olten, Switzerland
http://it.oetiker.ch tobi at oetiker.ch ++41 62 775 9902 / sb: -9900