
I managed to generate a PNG image with a fractal.

The image is 65,536 pixels high and 65,536 pixels wide. It’s too big to be decoded into memory and displayed. It probably has a lot of unused space near the borders, and I want to trim it to significantly reduce its area.

I tried GIMP and GraphicsMagick, but GIMP froze my computer and GraphicsMagick failed to allocate enough memory for the image.

I use Linux and I have 16GB of RAM. The compressed image is 6.2MB in size.

Can I trim the borders without loading the image fully into memory?

FWIW, in the end I used my school's server but I still want to know the answer.

Giacomo1968
matj1

2 Answers


You already know the Cartesian layout of the image.

Consider some Python code that reads the file in reasonably sized buffer loads: one or more lines of image pixels per buffer, but few enough to stay within available memory.

On most pixel lines, the left-most and right-most "border" pixels can be discarded; the first few and last few "border" lines can be dropped entirely.

The buffer size of the output file can, and will, be different from that of the input file, by definition.
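A minimal sketch of that row-streaming trim. Note one caveat: PNG rows are zlib-compressed, so truly streaming PNG needs a row-iterating decoder (the third-party pypng library offers one); to keep the sketch self-contained it uses the uncompressed binary PPM format instead, which makes the buffering idea explicit. All file names and margin values here are hypothetical:

```python
def trim_ppm(src_path, dst_path, left, top, right, bottom):
    """Copy src to dst, dropping `left`/`right` pixel columns and
    `top`/`bottom` pixel rows, holding only one row in memory at a time."""
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        # Parse the minimal binary-PPM header (comments not handled here).
        magic = src.readline().strip()
        assert magic == b"P6", "only binary PPM is handled in this sketch"
        width, height = map(int, src.readline().split())
        maxval = int(src.readline())

        new_w = width - left - right
        new_h = height - top - bottom
        dst.write(b"P6\n%d %d\n%d\n" % (new_w, new_h, maxval))

        row_bytes = width * 3            # 3 bytes per RGB pixel
        src.seek(top * row_bytes, 1)     # skip the top border rows
        for _ in range(new_h):
            row = src.read(row_bytes)    # one pixel line per buffer-load
            # Discard the left/right border pixels of this line.
            dst.write(row[left * 3 : row_bytes - right * 3])
        # The bottom border rows are simply never read.
```

Peak memory use is one pixel row (about 192 KiB for a 65,536-pixel-wide RGB line), regardless of image height.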

d-n
  • 1

ImageMagick's "crop" operator should be able to do this efficiently.

https://www.imagemagick.org/Usage/crop/
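A hedged sketch of one possible invocation (assumes ImageMagick 7; with version 6, substitute `convert` for `magick`). `-limit memory`/`-limit map` cap the in-RAM pixel cache and spill the rest to a disk-backed cache rather than failing outright, `-fuzz` lets `-trim` treat near-background pixels as border, and `+repage` resets the canvas offset left behind by the trim. The file names are hypothetical, and the script only prints the command rather than running it:

```shell
# Auto-trim uniform borders while capping RAM use; overflow goes to disk.
CMD='magick -limit memory 4GiB -limit map 8GiB fractal.png -fuzz 2% -trim +repage trimmed.png'
echo "$CMD"
```

Expect this to be slow on a 65,536 × 65,536 image, since the disk-backed pixel cache trades memory for I/O.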

What you want to achieve is definitely possible in theory. It's just a matter of finding an application that actually does it like this.

Bear in mind that a 65536 × 65536 image at 24-bit colour will consume about 12 GiB of RAM once decoded (65536 × 65536 × 3 bytes) - so it will actually fit in memory.

If it's 32-bit colour it would be about 16 GiB, which no longer fits in 16 GB of RAM alongside the rest of the system - but even then you can load the whole image: Linux will overflow memory into swap. This is slow and the performance won't be very good, but it won't actually fail - it will just succeed slowly, unless you run out of swap space.
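The back-of-the-envelope arithmetic behind those figures:

```python
SIDE = 65536  # pixels per edge

def decoded_gib(bytes_per_pixel):
    """Size of the fully decoded bitmap, in GiB (2**30 bytes)."""
    return SIDE * SIDE * bytes_per_pixel / 2**30

print("24-bit colour:", decoded_gib(3), "GiB")  # → 12.0
print("32-bit colour:", decoded_gib(4), "GiB")  # → 16.0
```

The 6.2 MB on disk versus ~12 GiB decoded is roughly a 2000:1 compression ratio, which is why the file looks deceptively small.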

Ben XO