We considered signal reconstruction with redundant expansions under distributed processing in noisy environments. Redundant expansions can reduce the noise corrupting the coefficients, but distributed processing ...

Sensorineural systems often use groups of redundant neurons to represent stimulus information both during transduction and population coding of features. This redundancy makes the system more robust to corruption in the ...

A system performs information processing when it preserves aspects of the input related to what the input represents while removing other aspects. To describe a system's information processing capability, input and ...

Basic probability theory, statistical signal processing and information theory, and inter-relationships among these disciplines form the foundations of a theory of information processing. Examples are drawn from point-process ...

Data broadcasting is potentially an effective and efficient way to share information in wireless sensor networks. Broadcasts offer energy savings over multiple, directed transmissions, and they provide a vehicle to exploit ...

The relations between information theory and neural coding are discussed by two researchers, one knowledgeable in information theory, the other in neuroscience. The classic information-theoretic notions of entropy, mutual ...

Traditional introductory courses in electrical engineering are typically circuit theory courses, which may include both analog and digital hardware and possibly software. The alternatives have focused on how to teach (using ...

The equivalent circuit concept derives from the Superposition Principle and Ohm's Law. Two forms of the equivalent circuit, the Thevenin equivalent and the Norton equivalent, distill any linear circuit into a source and ...
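
The equivalence can be illustrated numerically. The sketch below computes the Thevenin equivalent of a loaded voltage divider and converts it to the Norton form; the circuit and all component values are hypothetical, chosen only for illustration.

```python
def thevenin_divider(Vs, R1, R2):
    """Thevenin equivalent at the output node of a voltage divider:
    ideal source Vs, series resistor R1, shunt resistor R2."""
    V_th = Vs * R2 / (R1 + R2)   # open-circuit voltage (voltage division)
    R_th = R1 * R2 / (R1 + R2)   # source zeroed: R1 in parallel with R2
    return V_th, R_th

def norton_from_thevenin(V_th, R_th):
    """Norton source is the short-circuit current; resistance is unchanged."""
    return V_th / R_th, R_th

V_th, R_th = thevenin_divider(Vs=10.0, R1=2000.0, R2=3000.0)
I_n, R_n = norton_from_thevenin(V_th, R_th)
print(V_th, R_th, I_n)  # 6.0 V, 1200.0 ohms, 0.005 A
```

Either equivalent predicts the same terminal behavior for any attached load, which is the point of the distillation.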

Mutual information enjoys wide use in the computational neuroscience community for analyzing spiking neural systems. Its direct calculation is difficult because estimating the joint stimulus-response distribution requires ...
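
To make the estimation difficulty concrete, here is a minimal plug-in (histogram) estimator of mutual information from paired stimulus-response samples. This is a generic illustration, not the method of the paper; plug-in estimates are known to be biased when the joint distribution is estimated from limited data.

```python
import math
from collections import Counter

def plugin_mutual_information(pairs):
    """Plug-in estimate of I(S;R) in bits from (stimulus, response) samples,
    using empirical joint and marginal frequencies."""
    n = len(pairs)
    joint = Counter(pairs)
    ps = Counter(s for s, _ in pairs)
    pr = Counter(r for _, r in pairs)
    mi = 0.0
    for (s, r), c in joint.items():
        p_sr = c / n
        # p(s,r) * log2( p(s,r) / (p(s) p(r)) )
        mi += p_sr * math.log2(p_sr * n * n / (ps[s] * pr[r]))
    return mi

# Perfectly correlated, equiprobable binary data carries one bit
data = [(0, 0), (1, 1)] * 50
print(plugin_mutual_information(data))
```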

Mutual information between stimulus and response has been advocated as an information theoretic measure of a neural system's capability to process information. Once calculated, the result is a single number that supposedly ...

A method of improving the bearing-resolving capabilities of a passive array is discussed. This method is an adaptive beamforming method, having many similarities to the minimum energy approach. The evaluation of energy ...
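
The minimum-energy approach mentioned here is closely related to Capon's minimum-variance (MVDR) beamformer. As a sketch of that family of methods, the code below evaluates the standard Capon spatial spectrum P(θ) = 1 / (aᴴ R⁻¹ a) for a hypothetical uniform linear array; the array geometry, source angles, and noise level are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def capon_spectrum(R, thetas, M, d=0.5):
    """Capon (minimum-variance) spatial spectrum for an M-element uniform
    linear array with spacing d wavelengths; R is the array covariance."""
    Rinv = np.linalg.inv(R)
    power = []
    for th in thetas:
        a = np.exp(2j * np.pi * d * np.arange(M) * np.sin(th))  # steering vector
        power.append(1.0 / np.real(a.conj() @ Rinv @ a))
    return np.array(power)

# Toy scene: two plane waves at +/-20 degrees in weak white noise
M, N = 8, 200
rng = np.random.default_rng(0)
angles = np.deg2rad([-20.0, 20.0])
A = np.exp(2j * np.pi * 0.5 * np.outer(np.arange(M), np.sin(angles)))
S = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))
X = A @ S + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
R = X @ X.conj().T / N                      # sample covariance estimate
thetas = np.deg2rad(np.linspace(-90, 90, 361))
P = capon_spectrum(R, thetas, M)
print(np.rad2deg(thetas[np.argmax(P)]))     # peak lands near a source bearing
```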

We apply the recent theory of information processing to a hybrid distributed detection architecture that combines the traditional parallel and tandem architectures. Central to this theory is the Kullback-Leibler discrimination ...

We analyzed sustaining fiber responses in the crayfish visual system to light pulses using information processing techniques. The light pulse stimuli elicited a transient and a steady-state component in the EPSP input and ...

This paper develops a systematic method of studying the benefits of soft decoding for linear block codes by applying the concepts of information processing. We show that soft decoding uniformly improves decoder performance ...

To understand whether the population response expresses information better than the aggregate of the individual responses, the sum of the individual contributions is frequently used as a baseline against which to assess ...

Wireless sensor networks are often studied with the goal of removing information from the network as efficiently as possible. However, when the application also includes an actuator network, it is advantageous to determine ...

Researchers studying neural coding have speculated that populations of neurons would more effectively represent the stimulus if the neurons "cooperated": by interacting through lateral connections, the neurons would process ...

This paper develops a new systematic method of studying the benefits of 2-bit soft decisions by applying the concepts of information processing theory. We quantify performance in terms of the information transfer ratio and ...

We develop two new multivariate statistical dependence measures. The first, based on the Kullback-Leibler distance, results in a single value that indicates the general level of dependence among the random variables. The ...
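
The first kind of measure can be sketched for discrete random variables: the Kullback-Leibler distance between the joint distribution and the product of its marginals, which is zero exactly when the variables are independent. This is a generic illustration of a KL-based dependence measure, not necessarily the paper's exact construction.

```python
import math

def kl_dependence(joint):
    """KL distance (bits) between a discrete joint pmf, given as a dict
    mapping outcome tuples to probabilities, and the product of its
    marginals; zero iff the variables are independent."""
    dims = len(next(iter(joint)))
    marginals = [dict() for _ in range(dims)]
    for outcome, p in joint.items():
        for i, v in enumerate(outcome):
            marginals[i][v] = marginals[i].get(v, 0.0) + p
    d = 0.0
    for outcome, p in joint.items():
        if p > 0:
            prod = math.prod(marginals[i][v] for i, v in enumerate(outcome))
            d += p * math.log2(p / prod)
    return d

indep = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
dep = {(0, 0): 0.5, (1, 1): 0.5}
print(kl_dependence(indep), kl_dependence(dep))  # 0.0 and 1.0 bits
```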

We consider the problem of digital communication in a Rayleigh flat fading environment using a multiple antenna system, when both the transmitter and the receiver are unaware of the channel coefficients. Using Stein's lemma ...

In this paper, the problem of optimally communicating analog sources using a bandwidth and power limited digital system is considered. We propose and analyze optimal combined source-channel coding schemes that jointly ...

We create a framework based on Fisher information for determining the most effective population coding scheme for representing a continuous-valued stimulus attribute over its entire range. Using this scheme, we derive ...
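
For independent Poisson-spiking neurons with tuning curves f_i(s), the population Fisher information takes the textbook form I(s) = Σ f_i′(s)² / f_i(s). The sketch below evaluates this quantity for a hypothetical population of Gaussian tuning curves; the tuning widths, peak rates, and center spacing are illustrative assumptions.

```python
import math

def gaussian_tuning(c, width, rmax):
    """Gaussian tuning curve f(s) and its derivative f'(s)."""
    f = lambda s: rmax * math.exp(-((s - c) ** 2) / (2 * width ** 2))
    fp = lambda s: f(s) * (c - s) / width ** 2
    return f, fp

def population_fisher(curves, s):
    """Fisher information of independent Poisson neurons: sum of f'^2 / f."""
    return sum(fp(s) ** 2 / f(s) for f, fp in curves if f(s) > 0)

# Hypothetical population: centers tiled uniformly over the stimulus range
curves = [gaussian_tuning(c, width=1.0, rmax=30.0) for c in range(-10, 11)]
print(population_fisher(curves, 0.0))
```

Evaluating I(s) across the stimulus range is exactly the kind of whole-range comparison the framework calls for: a coding scheme is effective only if the Fisher information stays high everywhere, not just at a preferred value.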

We study the optimization of a binary decision system where quantized (soft) decisions are transmitted across an additive white Gaussian noise channel. We adjust the bit transmission intervals to maximize the Chernoff ...

When transmitting a sampled signal digitally, data and error correction bits must be transmitted at least as fast as the sampling rate. Typically, each bit is allocated the same transmission time interval, which means the ...

The textbook has traditionally been the fundamental tool of university teaching. The text both serves as the repository of facts and information and provides the recommended structure and sequence for teaching and learning ...

The SPIB (Signal Processing Information Base) project at Rice University is discussed. This information base will provide the signal processing researcher and the applications engineer with data, programs, and papers that ...

We define a new distance measure - the resistor-average distance - between two probability distributions that is closely related to the Kullback-Leibler distance. While the Kullback-Leibler distance is asymmetric in the ...
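
The name reflects how the two directed Kullback-Leibler distances combine: like resistors in parallel, 1/R(p,q) = 1/D(p‖q) + 1/D(q‖p), which makes the result symmetric and no larger than either directed distance. A minimal sketch with toy pmfs:

```python
import math

def kl(p, q):
    """Kullback-Leibler distance (nats) between discrete pmfs."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def resistor_average(p, q):
    """Resistor-average distance: the two directed KL distances
    combined via the parallel-resistor formula."""
    dpq, dqp = kl(p, q), kl(q, p)
    return dpq * dqp / (dpq + dqp)

p = [0.8, 0.2]
q = [0.4, 0.6]
r = resistor_average(p, q)
print(kl(p, q), kl(q, p), r)  # r is symmetric and below both
```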

Information processing theory endeavors to quantify how well signals encode information and how well systems, by acting on signals, process information. We use information-theoretic distance measures, the Kullback-Leibler ...
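
One quantity in this framework is the information transfer ratio: the KL distance between the two output distributions induced by a pair of stimulus conditions, divided by the KL distance between the corresponding input distributions. The data processing theorem bounds this ratio by one. A minimal sketch, using a binary symmetric channel as a stand-in system (the input distributions and crossover probability are hypothetical):

```python
import math

def kl(p, q):
    """Kullback-Leibler distance (nats) between discrete pmfs."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def bsc(p, eps):
    """Output pmf of a binary symmetric channel with crossover eps."""
    return [p[0] * (1 - eps) + p[1] * eps,
            p[0] * eps + p[1] * (1 - eps)]

# Two stimulus conditions induce two input distributions
x0, x1 = [0.9, 0.1], [0.2, 0.8]
eps = 0.1
gamma = kl(bsc(x0, eps), bsc(x1, eps)) / kl(x0, x1)
print(gamma)  # between 0 and 1 by the data processing theorem
```

A ratio near one means the system preserves nearly all the distance the input carries about the stimulus change; a ratio near zero means the system destroys it.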

During the stationary portion of a neuron's spiking response to a stimulus, the stimulus could be coded in the average rate and, more elaborately, in the statistics of the sequence of interspike intervals. We use information ...