I have a program that I've written on my Mac, but it won't run on my Raspberry Pi because it runs out of RAM (MemoryError).
The essence of the program is some image processing: it convolves a 640x480 uint8 image with a complex128 filter that is twice the size in each dimension.
I figure the memory usage is:

Initial image:
640 x 480 x 8 bits / (8 bits per byte) / (1024 bytes per KB) = 300 KB

Complex matrix:
640 x 480 x 2^2 x 128 bits / (8 bits per byte) / (1024^2 bytes per MB) = 18.75 MB
Even if it has to hold two or three copies of these various matrices in memory, that should still be a fairly small footprint, perhaps under 100 MB. Unfortunately it seems to exhaust the full 330 MB available (the Python runtime has to load into this space as well).
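As a sanity check on that arithmetic, NumPy's nbytes reports the same figures (the 960x1280 shape is my assumption of "twice the size" in each dimension):

    import numpy as np

    img = np.zeros((480, 640), dtype=np.uint8)          # the 640x480 source image
    gabor = np.zeros((960, 1280), dtype=np.complex128)  # assumed filter, twice the size in each dimension

    print(img.nbytes / 1024.0)         # 300.0  -> 300 KB
    print(gabor.nbytes / 1024.0 ** 2)  # 18.75  -> 18.75 MB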
- Is my analysis correct?
- Any tips on how to manage memory a bit better in Python?
UPDATE:
As suggested below, I've done some memory profiling, and it is indeed the fftconvolve call that spikes the RAM usage, as follows:
    Line #    Mem usage    Increment   Line Contents
    ================================================
        65   86.121 MiB    0.000 MiB   @profile
        66                             def iriscode(self):
        67   86.121 MiB    0.000 MiB       img = self.polar
        68
        69   86.379 MiB    0.258 MiB       pupil_curve = find_max(img[0:40])
        70   86.379 MiB    0.000 MiB       blur = cv2.GaussianBlur(self.polar, (9, 9), 0)
        71   76.137 MiB  -10.242 MiB       iris_fft = fit_iris_fft(radial_diff(blur[50:,:])) + 50
        72
        73   76.160 MiB    0.023 MiB       img = warp(img, iris_fft, pupil_curve)
        74                                 # cv2.imshow("mask",np.uint8(ma.getmaskarray(img))*255)
        75
        76                                 global GABOR_FILTER
        77  262.898 MiB  186.738 MiB       output = signal.fftconvolve(GABOR_FILTER, img, mode="valid")
Still, the magnitude of this increase surprises me. Any ideas what I can do to reduce it? I tried using complex64 instead of complex128 but the memory usage was the same.
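My own rough attempt to account for the jump, assuming fftconvolve zero-pads both inputs out to their combined size and, I suspect, does the FFTs in double precision internally (which would also explain why complex64 made no difference); the shapes below are my assumptions based on the sizes given above:

    import numpy as np

    filt_shape = (960, 1280)   # assumed GABOR_FILTER shape
    img_shape = (480, 640)     # assumed shape of img after warp()

    # fftconvolve works on arrays padded out to shape1 + shape2 - 1
    full_shape = [f + i - 1 for f, i in zip(filt_shape, img_shape)]
    per_array = np.prod(full_shape) * np.dtype(np.complex128).itemsize

    print(full_shape)               # [1439, 1919]
    print(per_array / 1024.0 ** 2)  # ~42 MiB per padded complex128 buffer

    # Padded copies of both inputs, their forward FFTs and the inverse FFT of
    # their product mean several ~42 MiB buffers can be alive at once; if SciPy
    # also rounds the dimensions up to FFT-friendly sizes, the buffers get
    # bigger still, so a 150-200 MiB spike seems plausible.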
scipy.signal.convolve2d? I'm not aware of a numpy 2d convolution function.
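If that is the suggestion, the replacement call would presumably look like the sketch below (using the same GABOR_FILTER and img as in the profiled code); it avoids the large padded FFT buffers, but a direct convolution with a kernel this big would be much slower, so I'm not sure it is practical on the Pi:

    from scipy import signal

    # GABOR_FILTER and img as in the profiled function above.
    # Direct 2D convolution in the same "valid" mode: memory stays close to the
    # inputs plus the output, but run time scales with kernel size x output size,
    # which is very large for a kernel of this size.
    output = signal.convolve2d(GABOR_FILTER, img, mode="valid")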