Dask array supports these operations by creating a new array where each
block is slightly expanded by the borders of its neighbors. This costs an
excess copy and the communication of many small chunks, but allows localized
functions to evaluate in an embarrassingly parallel manner. We call this
process ghosting.
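As a small sketch of what ghosting does to chunk sizes (note that in current Dask releases this functionality lives under `dask.array.overlap`, where `ghost` was renamed to `overlap`):

```python
import dask.array as da

x = da.arange(16, chunks=4)  # four blocks of four elements each

# Expand each block by one element borrowed from each neighbor;
# 'reflect' fills in the outermost edges by mirroring
g = da.overlap.overlap(x, depth=1, boundary='reflect')

# Every block grows from 4 to 6 elements
print(g.chunks)
```

Each interior block now carries a one-element border from its neighbors, so a localized function applied per block sees the data it needs without cross-block communication.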

While in this case we used a SciPy function, this could have been any
arbitrary function. This is a good interaction point with Numba.
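For example, any plain Python function that consumes and returns a NumPy array can be mapped over the blocks (`double` here is a hypothetical stand-in for your own function):

```python
import dask.array as da

def double(block):
    # Any function taking and returning a NumPy array of the
    # same shape works here, including Numba-compiled ones
    return block * 2

x = da.ones((8, 8), chunks=(4, 4))
y = x.map_blocks(double)
print(y.sum().compute())  # 64 ones, doubled
```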

If your function does not preserve the shape of the block then you will need to
provide a chunks keyword argument. If your block sizes are regular then
this can be a blockshape, such as (1000, 1000); if your blocks are irregular
then this must be a full chunks tuple, for example ((1000, 700, 1000), (200, 300)).
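A minimal sketch of a shape-changing blocked function, using a hypothetical `first_row` helper that shrinks each block:

```python
import dask.array as da

x = da.ones((8, 8), chunks=(4, 4))

def first_row(block):
    return block[:1, :]  # shrinks each (4, 4) block to (1, 4)

# Tell Dask the new per-block shape, since it cannot infer it
y = x.map_blocks(first_row, chunks=(1, 4))
print(y.shape)
```

There are two blocks along each axis, so the result has shape (2, 8).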

>>> g.map_blocks(myfunc, chunks=(5, 5))

If your function needs to know the location of the block on which it operates,
you can give it a keyword argument block_id:

>>> def func(block, block_id=None):
...     ...

This extra keyword argument will be given a tuple that provides the block
location, like (0, 0) for the upper-left block or (0, 1) for the block
just to its right.

After mapping a blocked function you may want to trim off the borders from each
block by the same amount by which they were expanded. The function
trim_internal is useful here and takes the same depth argument
given to ghost.
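Continuing the one-dimensional sketch from above (again using the `da.overlap` module, where this functionality lives in current Dask releases), the expand-then-trim round trip looks like:

```python
import dask.array as da

x = da.arange(16, chunks=4)
g = da.overlap.overlap(x, depth=1, boundary='reflect')  # blocks of 6

# ... apply a blockwise function to g here ...

# Trim one element back off each block boundary, matching the
# depth used above; the axes argument maps axis -> depth
t = da.overlap.trim_internal(g, {0: 1})
print(t.chunks)
```

The result has the original chunk structure, so downstream operations see the same array layout as before ghosting.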