My current problem requires an exact (symbolic) inverse of a scaled AR(1) matrix in $n$ dimensions. (I don't know what this matrix would be called in general; I'm sure it is used often.) It serves as a smoothing prior on a function sampled on a uniform grid. For a one-dimensional function, the matrix is
\begin{equation}
C_{i,j} = \rho\,\alpha^{|i-j|}.
\end{equation}

(This can also be found in Kac, M., Murdock, W., and Szegö, G. (1953). On the eigenvalues of certain Hermitian forms. J. Rational Mech. Anal., 2:767–800.)
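For what it's worth, the inverse of this matrix is known in closed form: up to the scalar $1/(\rho(1-\alpha^2))$, it is tridiagonal, with $1$ in the two corner entries, $1+\alpha^2$ on the interior diagonal, and $-\alpha$ on the off-diagonals. A quick numerical sketch (the parameter values below are purely illustrative):

```python
import numpy as np

n, alpha, rho = 6, 0.7, 2.5  # illustrative values, not from the problem

# Scaled AR(1) / Kac-Murdock-Szego matrix: C[i, j] = rho * alpha**|i - j|
idx = np.arange(n)
C = rho * alpha ** np.abs(idx[:, None] - idx[None, :])

# Claimed closed-form inverse: 1 / (rho * (1 - alpha^2)) times a tridiagonal
# matrix with 1 at the two corners, 1 + alpha^2 on the interior diagonal,
# and -alpha on both off-diagonals.
T = np.diag(np.r_[1.0, np.full(n - 2, 1 + alpha**2), 1.0])
T += np.diag(np.full(n - 1, -alpha), 1) + np.diag(np.full(n - 1, -alpha), -1)
C_inv = T / (rho * (1 - alpha**2))

print(np.allclose(C @ C_inv, np.eye(n)))  # True
```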

I would imagine that this can be generalized to the higher-dimensional case, where a separate $\alpha$ now spreads along each direction. This would allow my uniformly sampled $n$-dimensional function to be smooth, with a covariance of the form
\begin{equation}
C_{(i,j),(k,l)} = \rho \alpha_x^{|i-k|} \alpha_y^{|j-l|},
\end{equation}
but written as one giant matrix acting on the flattened function (the vec operation: representing the $n$-dimensional function as a vector under some ordering). Can anybody recommend a book on such symbolic matrix inversions that would cover this?
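If the covariance factorizes as written above, then under the usual vec ordering the flattened matrix is a Kronecker product, $C = \rho\, A_x \otimes A_y$, so its inverse is $\rho^{-1} A_x^{-1} \otimes A_y^{-1}$, with each 1-D factor having the tridiagonal closed-form inverse. A numerical sketch of this structure (sizes and parameters are illustrative assumptions):

```python
import numpy as np

def kms(n, alpha):
    """1-D AR(1) / Kac-Murdock-Szego matrix with entries alpha**|i - j|."""
    idx = np.arange(n)
    return alpha ** np.abs(idx[:, None] - idx[None, :])

nx, ny = 4, 5                 # grid sizes (illustrative)
ax, ay, rho = 0.6, 0.8, 1.3   # assumed parameter values

# 2-D covariance over the flattened grid is a Kronecker product:
# C[(i,j),(k,l)] = rho * ax**|i-k| * ay**|j-l|.
C = rho * np.kron(kms(nx, ax), kms(ny, ay))

# Inverse via (A (x) B)^{-1} = A^{-1} (x) B^{-1}; each 1-D inverse is
# tridiagonal, so C^{-1} is sparse even though C is dense.
C_inv = np.kron(np.linalg.inv(kms(nx, ax)), np.linalg.inv(kms(ny, ay))) / rho

print(np.allclose(C @ C_inv, np.eye(nx * ny)))  # True
```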