I use matplotlib.pyplot.imsave with the argument cmap='gray' to save a 1024x1024 NumPy array as a grayscale image, but when I then read the saved image back with matplotlib.pyplot.imread, I get a 1024x1024x4 array. Why is this?
Here is the code:
import numpy as np
import matplotlib.pyplot as plt
im = np.random.rand(1024, 1024)
print(im.shape)                          # (1024, 1024)
plt.imsave('test.png', im, cmap='gray')  # save as a grayscale image
im = plt.imread('test.png')              # read the saved image back
print(im.shape)                          # (1024, 1024, 4) -- four channels?
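For what it's worth, when I poke at the array that imread returns, the three colour channels seem to be identical and the alpha channel fully opaque, so (if I'm reading this right) a single-channel array can be recovered just by slicing out one channel:
loaded = plt.imread('test.png')                     # float32, shape (1024, 1024, 4)
print(np.allclose(loaded[..., 0], loaded[..., 1]))  # True: R == G
print(np.allclose(loaded[..., 1], loaded[..., 2]))  # True: G == B
print(np.all(loaded[..., 3] == 1.0))                # True: alpha is a constant 1
gray = loaded[..., 0]                               # take any one colour channel
print(gray.shape)                                   # (1024, 1024)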
The documentation for imread states that "The returned array has shape (M, N) for grayscale images." I suppose this raises the question of what exactly counts as a grayscale image. How are grayscale images stored on disk, and how does Matplotlib decide whether to read an image as grayscale, RGB, RGBA, etc. (and why is this one being read as RGBA)?
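For completeness, this is how I tried to check what imsave actually writes to disk (assuming Pillow is installed; the comment below is what I see / expect, not something taken from the matplotlib docs):
from PIL import Image
print(Image.open('test.png').mode)  # 'RGBA' -- the PNG on disk appears to have four channels itself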