RRD Error Code 1 RRD Created on Other Architecture

  • Hello all,

    I'm getting the following error….

    php: /status_rrd_graph_img.php: Failed to create graph with error code 1, the error is: ERROR: This RRD was created on other architecture
    /usr/bin/nice -n20 /usr/local/bin/rrdtool graph /tmp/system-processor.rrd-month.png --start 1311020973 --end 1313785773 --vertical-label "utilization, number" --color SHADEA#eeeeee --color SHADEB#eeeeee --title "hostname - System :: Processor - 1 month - 1 hour average" --height 200 --width 620 DEF:"user=/var/db/rrd/system-processor.rrd:user:AVERAGE" DEF:"nice=/var/db/rrd/system-processor.rrd:nice:AVERAGE" DEF:"system=/var/db/rrd/system-processor.rrd:system:AVERAGE" DEF:"interrupt=/var/db/rrd/system-processor.rrd:interrupt:AVERAGE" DEF:"processes=/var/db/rrd/system-processor.rrd:processes:AVERAGE" AREA:"user#990000:user" AREA:"nice#a83c3c:nice:STACK" AREA:"system#b36666:system:STACK" AREA:"interrupt#bd9090:interrupt:STACK" LINE2:"processes#cccccc:processes" COMMENT:"\n" COMMENT:" minimum average maximum current\n" COMMENT:"Us

    I have pfSense running on an AMD quad-core (or possibly six-core) processor, and I accidentally downloaded and installed the i386 build. However, shouldn't it still be able to generate RRD graphs even though I installed i386 on a 64-bit-capable processor? When I go to the RRD Graphs page, red text where the graph image should be tells me there was an error generating graphs and to check the system logs.

    I'm going to reload it with amd64 shortly, but it could be a couple of weeks before I get backup hardware in and set up so that users won't see any downtime. In the meantime, I'd like this information.

    Thanks again!

  • I believe that the RRD format is different between the architectures, and isn't interchangeable.


  • It is different and not interchangeable. You'll either have no data until you're back on amd64, or you'll have to delete all your data and let it be recreated (and do the same again once you're back on amd64).
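One possible way to carry the data across, sketched below under the assumption that rrdtool's XML round-trip works as documented: the binary `.rrd` files are architecture-specific, but the XML produced by `rrdtool dump` is portable and can be re-imported with `rrdtool restore` on the new install. The `/var/db/rrd` path is pfSense's default RRD location; adjust as needed.

```shell
#!/bin/sh
# Sketch, not a tested procedure: export every RRD to portable XML on
# the old i386 box, copy the XML files over, then restore on amd64.

RRD_DIR="/var/db/rrd"    # pfSense's default RRD directory

# On the old i386 install: dump each binary RRD to XML.
for rrd in "$RRD_DIR"/*.rrd; do
    [ -e "$rrd" ] || continue          # skip if the glob matched nothing
    rrdtool dump "$rrd" > "${rrd%.rrd}.xml"
done

# After copying the .xml files to the amd64 install: rebuild the RRDs.
for xml in "$RRD_DIR"/*.xml; do
    [ -e "$xml" ] || continue
    rrdtool restore "$xml" "${xml%.xml}.rrd"
done
```

This avoids losing the accumulated graph history, since the XML is the same regardless of whether it was written on i386 or amd64.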
