We have studied the theory of stochastic processes and information theory, and obtained the following results.

(1) We investigated large deviation theorems and their applications. First, we considered a signal detection problem in a continuous-time white Gaussian channel. Suppose that one of two stationary signals is transmitted over the channel. Based on the observation of the output, we are required to decide which signal was sent. This is a kind of hypothesis testing problem. Using a large deviation theorem, we showed that the error probability of the detection goes to zero exponentially fast.

(2) The maximum entropy method is a method for estimating the spectral density function of a stationary process. Using a large deviation theorem, we proved a limit theorem which shows that the maximum entropy method is optimal if the sample size is large enough.

(3) Channel capacity, mutual information, and mean square error play fundamental roles in information theory. We proved some inequalities for these quantities, which should be useful in information theory. One of them is as follows. Let C be the feedback capacity of a communication channel with an additive noise Z, and denote by C^* the feedback capacity of the channel with Gaussian noise Z^* having the same covariance as Z. Then it holds that C^* ≤ C ≤ C^* + H(Z; Z^*), where H(Z; Z^*) denotes the relative entropy of Z with respect to Z^*.
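The exponential decay of the detection error probability in result (1) can be observed numerically in a simple discrete-time analogue. The sketch below is not from the report: it is a Monte Carlo experiment with two illustrative antipodal sinusoidal signals in unit-variance white Gaussian noise, using the standard likelihood-ratio (matched-filter) decision rule; the function name and signal choice are hypothetical.

```python
# Illustrative (hypothetical) discrete-time analogue of the detection
# problem: one of two known signals is sent over a white Gaussian channel,
# and the likelihood-ratio rule decides which one it was.
import numpy as np

def error_probability(n, trials=5000, seed=0):
    """Empirical error probability of the likelihood-ratio detector when
    signal s0 is sent over n samples of a unit-variance white Gaussian
    channel. The antipodal sinusoids are chosen only for illustration."""
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    s0 = np.sin(0.3 * t)    # signal actually transmitted
    s1 = -s0                # the competing candidate signal
    Y = s0 + rng.standard_normal((trials, n))  # noisy channel outputs
    # Likelihood-ratio rule for unit-variance Gaussian noise:
    # decide s1 iff <y, s1> - |s1|^2/2 > <y, s0> - |s0|^2/2.
    stat0 = Y @ s0 - s0 @ s0 / 2
    stat1 = Y @ s1 - s1 @ s1 / 2
    return float(np.mean(stat1 > stat0))
```

Doubling the observation length n roughly squares the error probability's decay factor, consistent with the exponential rate that the large deviation theorem makes precise.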
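For result (2), the maximum entropy spectral estimate of a stationary process coincides, in the finite-order case, with an autoregressive fit obtained from the Yule-Walker equations. The sketch below is an assumption-laden illustration, not the report's construction: the function names are hypothetical, and the AR order p is taken as given.

```python
# Hypothetical sketch of the maximum entropy (autoregressive) spectral
# estimate: fit AR(p) coefficients to sample autocovariances via the
# Yule-Walker equations, then evaluate the implied spectral density.
import numpy as np

def sample_autocov(x, max_lag):
    """Biased sample autocovariances r(0), ..., r(max_lag)."""
    x = np.asarray(x) - np.mean(x)
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

def max_entropy_spectrum(x, p, freqs):
    """AR(p) maximum entropy spectral density at frequencies in cycles/sample."""
    r = sample_autocov(x, p)
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])  # Toeplitz
    a = np.linalg.solve(R, r[1:p + 1])      # Yule-Walker AR coefficients
    sigma2 = r[0] - np.dot(a, r[1:p + 1])   # innovation variance
    # S(f) = sigma2 / |1 - sum_k a_k e^{-2*pi*i*f*k}|^2
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, p + 1)))
    return sigma2 / np.abs(1 - z @ a) ** 2
```

Applied to data from a stable AR(2) process, the estimate reproduces the spectral peak at the frequency of the process's complex pole pair.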
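The relative entropy H(Z; Z^*) in the bound of result (3) measures how far the noise Z is from the Gaussian noise Z^* with the same covariance. As a small numerical illustration (an assumption on our part, not a computation from the report), one can estimate H(Z; Z^*) by Monte Carlo for a unit-variance Laplace noise Z against the standard Gaussian Z^*, where the closed form is (log pi)/2 - 1/2 nats; the function name is hypothetical.

```python
# Hypothetical Monte Carlo estimate of the relative entropy H(Z; Z^*) =
# E_Z[log p_Z(Z) - log p_{Z^*}(Z)] for a unit-variance Laplace noise Z
# and the standard Gaussian Z^* with the same variance.
import numpy as np

def relative_entropy_laplace_vs_gaussian(samples=200000, seed=0):
    rng = np.random.default_rng(seed)
    b = 1 / np.sqrt(2)                 # Laplace scale: variance = 2*b^2 = 1
    z = rng.laplace(scale=b, size=samples)
    log_p = -np.log(2 * b) - np.abs(z) / b          # Laplace log-density
    log_q = -0.5 * np.log(2 * np.pi) - 0.5 * z**2   # N(0, 1) log-density
    return float(np.mean(log_p - log_q))            # exact value: log(pi)/2 - 1/2
```

The estimate is about 0.072 nats, so in this case the bound says the feedback capacity with Laplace noise exceeds the Gaussian-noise capacity by at most 0.072 nats per unit time.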