$\mathbf{y} = (y_1,\ldots,y_N)$ is the vector of binomial observed values,

$\mathbf{\pi} = (\pi_1,\ldots,\pi_N)$ is the vector of true probabilities,

$\mathbf{\hat{p}} = (\hat{p}_1,\ldots,\hat{p}_n)$ is the vector of fitted proportions, based on the logistic regression model, and

$\mathbf{N} = diag(n_1,\ldots,n_N)$ (the number of observations at each explanatory variable pattern/combination). Then, the "fitted values" can be referred to as $\mathbf{N \hat{p}} $.

Then, what I am interested in is
$$ Var(\mathbf{y} - \mathbf{N} \mathbf{\hat{p}}) = Var(\mathbf{y}) + \mathbf{N} \cdot Var(\mathbf{\hat{p}}) \cdot \mathbf{N}' - Cov(\mathbf{y},\mathbf{\hat{p}})\cdot \mathbf{N}' - \mathbf{N} \cdot Cov(\mathbf{\hat{p}},\mathbf{y})$$
That is, I am interested in the covariance matrix of the vector of "response" residuals from a logistic regression model. The name "response" matches the `type = "response"` option used when extracting residuals from a GLM in R.
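Since sample covariances are bilinear too, the expansion of $Var(\mathbf{y} - \mathbf{N\hat{p}})$ (with both cross terms) can be sanity-checked numerically. A minimal numpy sketch with made-up correlated draws standing in for $(\mathbf{y}, \mathbf{\hat{p}})$; all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical joint draws: k = 3 covariate patterns, S correlated samples
# standing in for (y, p_hat). These are NOT from a fitted model; they only
# illustrate the bilinear expansion of the covariance.
k, S = 3, 10_000
Z = rng.normal(size=(2 * k, S))
A = rng.normal(size=(2 * k, 2 * k))
W = A @ Z                       # correlated rows
y, p = W[:k], W[k:]             # stand-ins for y and p_hat
Nmat = np.diag([20.0, 15.0, 30.0])

C = np.cov(np.vstack([y, p]))   # joint sample covariance (rows = variables)
Cyy, Cyp, Cpp = C[:k, :k], C[:k, k:], C[k:, k:]

# Var(y - N p_hat) = Var(y) + N Var(p_hat) N' - Cov(y,p_hat) N' - N Cov(p_hat,y)
lhs = np.cov(y - Nmat @ p)
rhs = Cyy + Nmat @ Cpp @ Nmat.T - Cyp @ Nmat.T - Nmat @ Cyp.T
```

Because the sample covariance is itself bilinear, `lhs` and `rhs` agree to machine precision here; the statistical difficulty is only in evaluating the individual terms.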

For example, it is pretty easy to get
$$ \widehat{Var}(\mathbf{y}) = diag(n_1\hat{p}_1(1-\hat{p}_1),\ldots,n_N\hat{p}_N(1-\hat{p}_N)), $$
since the observations are binomial and independent.
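In code this estimate is just a diagonal matrix built from the trial counts and fitted proportions. A minimal numpy sketch with made-up numbers (both `n` and `p_hat` are hypothetical):

```python
import numpy as np

# Hypothetical trial counts n_i and fitted proportions p_hat_i per pattern.
n = np.array([20.0, 15.0, 30.0])
p_hat = np.array([0.2, 0.5, 0.7])

# Estimated Var(y): diagonal because the binomial observations are independent.
Vy_hat = np.diag(n * p_hat * (1.0 - p_hat))
```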

Also, $Var(\mathbf{\hat{p}})$ is fairly straightforward: $\widehat{Var}(\mathbf{\hat{\beta}}) = (X'VX)^{-1}$ (from the Fisher information), where $V = \widehat{Var}(\mathbf{y})$, and the Delta Method then gives the approximation $Var(\mathbf{\hat{p}}) \approx D(X'VX)^{-1}D'$, where $D = diag(\hat{p}_1(1-\hat{p}_1),\ldots,\hat{p}_N(1-\hat{p}_N))\,X$ is the Jacobian of the inverse-logit transformation.
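The Delta Method step can be sketched in a few lines of numpy. Here the design matrix, trial counts, and fitted coefficients are all made-up placeholders, not output from an actual fit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 5 covariate patterns, n_i trials each.
n = np.array([20.0, 15.0, 30.0, 25.0, 10.0])
X = np.column_stack([np.ones(5), rng.normal(size=5)])  # design matrix
beta_hat = np.array([-0.3, 0.8])                       # assumed fitted coefficients
p_hat = 1.0 / (1.0 + np.exp(-X @ beta_hat))            # fitted proportions

# V = diag(n_i p_i (1 - p_i)): the estimated Var(y) from above.
V = np.diag(n * p_hat * (1.0 - p_hat))

# Var(beta_hat) = (X' V X)^{-1}, the inverse Fisher information.
var_beta = np.linalg.inv(X.T @ V @ X)

# Delta Method: p = logit^{-1}(X beta) has Jacobian D = diag(p(1-p)) X,
# so Var(p_hat) is approximately D Var(beta_hat) D'.
D = (p_hat * (1.0 - p_hat))[:, None] * X
var_p = D @ var_beta @ D.T
```

The resulting `var_p` is symmetric and positive semi-definite (of rank at most the number of coefficients), as a Delta Method covariance should be.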

What I can't get at is $Cov(\mathbf{y},\mathbf{\hat{p}})$, which is what I need to finish the problem.

Or maybe there is a clever trick in another direction that gets around all of this?