I need an m-dimensional `np.ndarray` lattice structure, denoted `arr`, with the following properties, where m and n are constants (e.g. m=3, n=50):
- `arr.shape == (n, n, n, ..., n)` where `n in range(100)`
- `len(arr.shape) == m` where `m in range(4)`

so up to 100,000,000 lattice points.
Is it better to store this as a 1D array and overload `__getitem__` and `__setitem__` myself, or is NumPy already memory-efficient when storing large multi-dimensional arrays?
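For context, here is a small sketch (with the example m=3, n=50 from above) showing what I have checked so far: NumPy appears to store a multi-dimensional array as a single contiguous 1D buffer, with shape and strides kept as metadata, so a flat view shares the same memory without copying.

```python
import numpy as np

m, n = 3, 50  # example dimensions from the question

# An m-dimensional array: NumPy allocates one contiguous 1D buffer;
# shape and strides are just metadata on top of it.
arr = np.zeros((n,) * m, dtype=np.float64)

print(arr.shape)                   # (50, 50, 50)
print(arr.flags['C_CONTIGUOUS'])   # True: one contiguous block of memory
print(arr.nbytes)                  # 50**3 * 8 = 1000000 bytes

# ravel() on a contiguous array returns a flat 1D *view*, not a copy:
flat = arr.ravel()
flat[0] = 1.0
print(arr[0, 0, 0])                # 1.0 -- same underlying memory
```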