Hi all,
Is there a way in numpy to associate a (large) matrix with a disk
file, tile and index it, and cache the pieces as you process them?
This is pretty important with massive image files, which can't fit
into working memory, but in which (for example) you might be doing a
convolution over a 100 x 100 pixel window on only a small subset of
the image.
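For what it's worth, numpy's `np.memmap` covers part of this: it maps a file on disk to an array, and slicing only touches the pages that back the slice, with the OS page cache doing the caching. A minimal sketch (the file path and shape here are just placeholders):

```python
import numpy as np
import tempfile, os

# A file-backed array; here a small stand-in for a huge image.
path = os.path.join(tempfile.mkdtemp(), "image.dat")
shape = (1000, 1000)
img = np.memmap(path, dtype=np.float32, mode="w+", shape=shape)
img[:] = 1.0          # writes go through to the file
img.flush()

# Re-open read-only and touch only a 100 x 100 window; the OS
# pages in just the parts of the file that back this slice.
img2 = np.memmap(path, dtype=np.float32, mode="r", shape=shape)
window = img2[450:550, 450:550]
print(window.mean())  # 1.0
```

Note this gives row-major locality, not true 2-D tiling: a 100 x 100 window still spans 100 separate runs of the file, so it is not a full answer to the tiling/caching question.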
I know that caching algorithms are (1) complicated and (2) never
general. But there you go.
Perhaps I just can't find it, or perhaps it would be a good project
for the future? If HDF or something similar does this already, could
someone point me in the right direction?
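On the HDF question: h5py exposes HDF5's chunked datasets, which store the array as independent 2-D tiles on disk and cache recently used chunks, so it may be close to what's being asked for. A hedged sketch, assuming h5py is installed (path and chunk size are arbitrary choices here):

```python
import h5py
import numpy as np
import tempfile, os

path = os.path.join(tempfile.mkdtemp(), "image.h5")
with h5py.File(path, "w") as f:
    # Chunked storage: each 100 x 100 tile lives as its own
    # block on disk, and HDF5 keeps a chunk cache in memory.
    dset = f.create_dataset("image", shape=(1000, 1000),
                            dtype="f4", chunks=(100, 100))
    dset[200:300, 200:300] = 2.0   # writes only the touched chunk

with h5py.File(path, "r") as f:
    tile = f["image"][200:300, 200:300]   # reads only that chunk
print(tile.mean())  # 2.0
```

Aligning the window you process with the chunk shape keeps each read to a single contiguous block, which is the main win over a flat memmap.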
Thx