Abstract

In this paper, the distributed regression estimation problem with incomplete
data over a time-varying multi-agent network is investigated. Regression
estimation is carried out based on local agent information whose incompleteness
follows a non-ignorable missing-data mechanism. By virtue of a gradient-based
design and an adaptive filter, a distributed algorithm is proposed to solve
the regression estimation problem with incomplete data. With the help of
convex analysis and stochastic approximation techniques, exact convergence
of the proposed algorithm is established under incomplete data and a
jointly-connected multi-agent topology. Moreover, an online regret analysis
is also given for real-time learning. Finally, simulations demonstrate that
the proposed algorithm solves the estimation problem in a distributed way,
even when the network configuration is time-varying.

Therefore, for all $\epsilon>0$, there exists an integer $k_{1}$ such that $\|R^{\bar{A},i}_k-R^{\bar{A},i}\|<\epsilon$ for all $k>k_{1}$. Define $M_{1}=\max\{\|R^{\bar{A},i}_{1}-R^{\bar{A},i}\|,\|R^{\bar{A},i}_{2}-R^{\bar{A},i}\|,\ldots,\|R^{\bar{A},i}_{k_{1}}-R^{\bar{A},i}\|,\epsilon\}$. Then $\|R^{\bar{A},i}_k-R^{\bar{A},i}\|\leqslant M_{1}$ for all $k\geqslant 0$. Analogously, $\|r^{y\bar{A},i}_{k}-r^{y\bar{A},i}\|\leqslant M_{2}$ for all $k\geqslant 0$. From Remark Rem3, we have $\|\xi^{i}_{k}\|<C_{x}$. Hence, $\mathbb{E}\|\epsilon_i(k)\|\leqslant M_1C_x+M_2=M_{\epsilon}$ for all $k\geqslant 0$. By (14), $d^{i}_{k}=\nabla g_i(k)+\epsilon_i(k)$. Thus, $\mathbb{E}\|d^{i}_{k}\|\leqslant \mathbb{E}\|\nabla g_i(k)\|+\mathbb{E}\|\epsilon_i(k)\|\leqslant C_g+M_{\epsilon}=M_d$, which is bounded.
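As an illustration (not part of the proof), the uniform-bound construction for $M_1$ can be checked numerically; the matrices below are hypothetical stand-ins for the ergodic averages $R^{\bar{A},i}_k$ and their limit $R^{\bar{A},i}$:

```python
import numpy as np

rng = np.random.default_rng(0)
R_true = np.array([[2.0, 0.5], [0.5, 1.0]])  # stands in for the limit R^{A,i}

# Running (ergodic) averages R_k of noisy samples converge to R_true, so
# the deviations ||R_k - R_true|| admit a finite uniform bound M_1: the
# maximum over the finite prefix before the epsilon-neighborhood is entered.
samples = R_true + 0.1 * rng.standard_normal((500, 2, 2))
R_k = np.cumsum(samples, axis=0) / np.arange(1, 501)[:, None, None]
devs = np.linalg.norm(R_k - R_true, axis=(1, 2))

M1 = devs.max()  # finite, and dominates every deviation uniformly in k
assert np.isfinite(M1) and np.all(devs <= M1)
```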

Proof of Lemma Lm4

For all $i\in\mathcal{N},\;k\geqslant 0$, define $p^{i}_{k+1}=\xi^{i}_{k+1}-\sum_{j=1}^{N}w_{ij}(k)\xi^j_{k}$. We rewrite (8) compactly in terms of $\Psi(k,s)$, $k\geqslant s$, as follows: $\xi^{i}_{k+1}=\sum_{j=1}^{N}[\Psi(k,0)]_{ij}\xi^{j}_{0}+p^{i}_{k+1}+\sum_{s=1}^{k}\sum_{j=1}^{N}[\Psi(k,s)]_{ij}p^j_{s}$. Moreover, with Assumption Ass1 and by induction, the following equality holds: $\bar{\xi}_{k+1}=\frac{1}{N}\sum_{i=1}^{N}\xi^{i}_{0}+\frac{1}{N}\sum_{s=1}^{k+1}\sum_{j=1}^{N}p^j_{s}$. Consequently, we obtain that, for $i\in\mathcal{N}$, $\xi^{i}_{k+1}-\bar{\xi}_{k+1}=\sum_{j=1}^{N}\big([\Psi(k,0)]_{ij}-\frac{1}{N}\big)\xi^{j}_{0}+\big(p^{i}_{k+1}-\frac{1}{N}\sum_{j=1}^{N}p^j_{k+1}\big)+\sum_{s=1}^{k}\sum_{j=1}^{N}\big([\Psi(k,s)]_{ij}-\frac{1}{N}\big)p^j_{s}$. Therefore, $\forall i\in\mathcal{N}$,

From Theorem Thm1, $\|\xi^i_{k+1}-\bar{\xi}_{k+1}\|$ converges in mean. Then, by Fatou's lemma, the following relation holds: $0\leqslant\mathbb{E}[\liminf_{k\rightarrow\infty}\|\xi^i_{k+1}-\bar{\xi}_{k+1}\|]\leqslant\liminf_{k\rightarrow\infty}\mathbb{E}[\|\xi^i_{k+1}-\bar{\xi}_{k+1}\|]=0$, which yields $\mathbb{E}[\liminf_{k\rightarrow\infty}\|\xi^i_{k+1}-\bar{\xi}_{k+1}\|]=0$. Therefore, $\liminf_{k\rightarrow\infty}\|\xi^i_{k+1}-\bar{\xi}_{k+1}\|=0$ holds almost surely. Since $\|\xi^i_{k+1}-\bar{\xi}_{k}\|^2\leqslant\|\hat{\xi}^{i}_{k+1}-\bar{\xi}_{k}\|^2$,

According to Theorem 6.2 of [12], $\sum_{k=1}^{\infty}\iota_{k}\|\xi^j_{k}-\bar{\xi}_{k}\|<\infty$ with probability $1$. Therefore, together with $\sum_{k=1}^{\infty}N\iota^2_kM_{d}^2<\infty$, $\|\xi^i_{k+1}-\bar{\xi}_{k}\|^2$ converges almost surely by Lemma Lm2. Hence, the conclusion follows.
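For reference, the form of Fatou's lemma used in the argument above, stated for a nonnegative sequence of random variables $X_k\geqslant 0$, reads:

```latex
% Fatou's lemma for nonnegative random variables X_k >= 0:
\mathbb{E}\Big[\liminf_{k\to\infty} X_k\Big]
  \;\leqslant\; \liminf_{k\to\infty}\, \mathbb{E}[X_k],
% applied here with X_k = \|\xi^i_{k+1} - \bar{\xi}_{k+1}\|.
```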

Clearly, $\|\xi^i_{k+1}-\xi^*\|^2\leqslant\|\hat{\xi}^{i}_{k+1}-\xi^{*}\|^2$, and then $\|\xi^i_{k+1}-\xi^*\|^2\leqslant\|\sum_{j=1}^{N}w_{ij}(k)\xi^{j}_{k}-\xi^*\|^2+\iota^2_k\|d^{i}_{k}\|^2-2\iota_{k}(d^{i}_{k})^\text{T}(\sum_{j=1}^{N}w_{ij}(k)\xi^{j}_{k}-\xi^*)$, where we also use the first-order convexity inequality from [21]: for all $x_{1},x_{2}$, $g(x_{2})\geqslant g(x_{1})+\nabla g(x_{1})^\text{T}(x_{2}-x_{1})$. Recalling that $\mathbb{E}\|d^{i}_{k}\|\leqslant M_{d}$ in Lemma Lm3 and $\|\nabla g^{i}(\xi)\|\leqslant C_{g}$ in Remark Rem5, we have

and $\mathbb{E}[\epsilon_{i}^\text{T}(k)(\sum_{j=1}^{N}w_{ij}(k)\xi^{j}_{k}-\xi^*)]\leqslant \mathbb{E}\|\epsilon^{i}_k\|\|\sum_{j=1}^{N}w_{ij}(k)\xi^{j}_{k}-\xi^*\|$ for all $k=0,1,2,\ldots$. Therefore,

From Lemma Lm2, the sequence $\sum_{i=1}^{N}~\|\xi^i_{k}-\xi^*\|^2$ converges with probability 1 and $\sum_{k=1}^{\infty}v_{k}<\infty$.
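The first-order convexity inequality $g(x_{2})\geqslant g(x_{1})+\nabla g(x_{1})^\text{T}(x_{2}-x_{1})$ used in the derivation can be sanity-checked numerically; the least-squares quadratic below is an illustrative stand-in, not the paper's objective:

```python
import numpy as np

rng = np.random.default_rng(2)

# Convex quadratic g(x) = 0.5 ||A x - b||^2 with gradient A^T (A x - b).
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)
g = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad = lambda x: A.T @ (A @ x - b)

# First-order characterization of convexity:
# g(x2) >= g(x1) + grad(x1)^T (x2 - x1) at random point pairs.
for _ in range(100):
    x1, x2 = rng.standard_normal(3), rng.standard_normal(3)
    assert g(x2) >= g(x1) + grad(x1) @ (x2 - x1) - 1e-9
```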

As for $v_{k}$, by the boundedness of $\xi^{i}_{k}$ and the ergodicity of $\bar{A}^{i}_{k}$, we conclude that $\lim_{k\rightarrow\infty}(R^{\bar{A},i}_k-R^{\bar{A},i})\xi^{i}_{k}=0$. Moreover, $\lim_{k\rightarrow\infty}(r^{y\bar{A},i}-y^{i}_{k}\bar{A}^{i}_{k})=0$ by the stationarity of $y^{i}_{k}$ and $a^{i}_{k}$.

Similar to the demonstration of Theorem 6.2 in [12], we get $\sum_{k=1}^{\infty}2\sum_{i=1}^N\iota_{k}\mathbb{E}[\|\epsilon^{i}_k\|]\|\sum_{j=1}^{N}w_{ij}(k)\xi^{j}_{k}-\xi^*\|<\infty$, which implies $\sum_{k=1}^{\infty}2\iota_{k}(g(\bar{\xi}_{k})-g(\xi^*))<\infty$. Since $\sum_{k=1}^{\infty}\iota_{k}=\infty$, it follows that $\liminf_{k\rightarrow\infty}g(\bar{\xi}_{k})=g(\xi^*)$ holds almost surely. Moreover, $\lim_{k\rightarrow\infty}\|\xi^i_{k}-\bar{\xi}_{k}\|=0$ holds almost surely for all $i$, which yields the conclusion.
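As an illustration (not part of the proof), the averaging mechanism behind the consensus claim $\|\xi^i_k-\bar{\xi}_k\|\to 0$ can be sketched numerically; the fixed doubly stochastic matrix $W$ below is a hypothetical stand-in for $W(k)$, with the perturbations $p^i_k$ set to zero:

```python
import numpy as np

rng = np.random.default_rng(1)
N, d = 4, 2
xi = rng.standard_normal((N, d))  # initial states xi^i_0, stacked by rows

# A fixed doubly stochastic weight matrix (ring topology, Metropolis-style
# weights). With p^i_k = 0 the recursion is pure averaging, so the
# disagreement ||xi^i_k - xi_bar_k|| contracts geometrically while the
# network average xi_bar is preserved at every step.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

xi_bar = xi.mean(axis=0)
for _ in range(50):
    xi = W @ xi

assert np.allclose(xi.mean(axis=0), xi_bar)        # average invariant
assert np.linalg.norm(xi - xi_bar) < 1e-6          # consensus reached
```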

Proof of Lemma Lm5

Define $r^{i}_{k}=\xi^{i}_{k}-\hat{\xi}^{i}_{k}=P_{X}(\hat{\xi}^{i}_{k})-\hat{\xi}^{i}_{k}$. Since $X$ is convex and $W(k)$ is doubly stochastic, we have $\sum_{j=1}^{N}w_{ij}(k)\xi^{j}_k\in X$, which leads to $\|r^{i}_{k+1}\|\leqslant\|P_{X}(\hat{\xi}^{i}_{k+1})-\sum_{j=1}^{N}w_{ij}(k)\xi^{j}_{k}\|+\iota_{k}\|d^{i}_{k}\|\leqslant 2\iota_{k}\|d^{i}_{k}\|$. By Algorithm 1, we obtain $\bar{\xi}_{k+1}=\bar{\xi}_{k}-\frac{\iota_{k}}{N}\sum_{i=1}^{N}(\nabla g^{i}_k+\epsilon^{i}_k)+\frac{1}{N}\sum_{i=1}^{N}r^{i}_{k+1}$. As a result, we can decompose $\|\bar{\xi}_{k+1}-\xi\|^{2}$ by

Since $\langle\nabla\bar{g}^{i}_{k},\bar{\xi}_{k}-\xi^{i}_{k}\rangle\leqslant\|\nabla\bar{g}^{i}_{k}\|\|\bar{\xi}_{k}-\xi^{i}_{k}\|$ and $\|\xi^{i}_{k}-\xi\|^{2}+\|\xi^{i}_{k}-\bar{\xi}_{k}\|^{2}\geqslant\frac{1}{2}\|\bar{\xi}_{k}-\xi\|^{2}$, we can estimate $-\langle\nabla g^{i}_{k},\bar{\xi}_{k}-\xi\rangle$ as follows: $-\langle\nabla g^{i}_{k},\bar{\xi}_{k}-\xi\rangle\leqslant(\|\nabla g^{i}_{k}\|+\|\nabla\bar{g}^{i}_{k}\|)\|\bar{\xi}_{k}-\xi^{i}_{k}\|+g^{i}(\xi)-g^{i}(\bar{\xi}_{k})-\frac{\mu}{4}\|\bar{\xi}_{k}-\xi\|^{2}$.
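The projection bound $\|r^{i}_{k+1}\|\leqslant 2\iota_{k}\|d^{i}_{k}\|$ rests on the fact that, for $v\in X$, the projection $P_X(v-\iota d)$ lies no farther from $v-\iota d$ than $v$ does. A minimal numerical check, assuming for illustration that $X$ is the Euclidean unit ball:

```python
import numpy as np

rng = np.random.default_rng(3)

# Euclidean projection onto the unit ball, a closed convex set X.
def proj_ball(x):
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

# Check ||r|| = ||P_X(v - iota*d) - (v - iota*d)|| <= 2*iota*||d||:
# since v is in X, the projection error cannot exceed ||v - xi_hat||,
# which equals iota*||d||, hence in particular it is below 2*iota*||d||.
for _ in range(200):
    v = proj_ball(rng.standard_normal(3))   # plays the role of sum_j w_ij xi^j
    d = rng.standard_normal(3)              # plays the role of d^i_k
    iota = rng.uniform(0.0, 1.0)            # step size iota_k
    xi_hat = v - iota * d
    r = proj_ball(xi_hat) - xi_hat
    assert np.linalg.norm(r) <= 2 * iota * np.linalg.norm(d) + 1e-12
```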
