Hi all, I plan to use the USRP to run a feedback controller. The
basic idea is to read a feedback signal over three input channels
(two LFRX daughter boards) and process it to generate control signals
that are output over three DACs (using two LFTX daughter boards). All
of the processing will take place in the FPGA, and I don't plan to use
the decimation and demodulation/modulation stages that are part of
the default GNU Radio FPGA code.
However, to begin with, I would like to measure the latency of the
USRP. To do this, I read a single input channel using the LFRX
daughter board and output the same samples over the LFTX daughter
board.
The FPGA code itself is very straightforward in this first pass. I
started from one of the top-level FPGA images and cut out the
parts I wouldn't use. My code simply pads the 12-bit input to 14 bits
by shifting left by 2 and writes it to the output. Here's the relevant
code fragment:
always @(posedge clk64) adc0 <= #1 {rx_a_a, 2'b0};
assign tx_a = adc0;
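In case the Verilog is unclear, the concatenation above just appends two zero LSBs to the 12-bit ADC sample, i.e. a left shift by 2. A quick Python sketch of the same operation (assuming rx_a_a is the 12-bit ADC bus, as on the AD9862):

```python
def pad_to_14_bits(sample_12):
    """Mimic the Verilog {rx_a_a, 2'b0}: append two zero LSBs to a
    12-bit sample, producing a 14-bit word (a left shift by 2)."""
    assert 0 <= sample_12 < (1 << 12), "expects an unsigned 12-bit value"
    return sample_12 << 2  # same as concatenating 2'b0 on the right

print(hex(pad_to_14_bits(0xFFF)))  # 0x3ffc, full-scale 12-bit input
```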

With the default settings, the DAC accepts data at 32 MS/s
interleaved. You can change that via the AD9862 settings.

I have set up the ADC and DAC using the Python call
_write_9862(which_codec, regno, value) with the following values:

RX side:
  Rx mode: single-channel ADC
  Data type: unsigned ints (2's complement turned off)
  PGA gain: minimum

TX side:
  Tx mode: single-channel DAC
  Data type: unsigned ints (2's complement turned off)
  PGA gain: maximum
The problem I see is that the output from the FPGA has a phase lag of
180 degrees relative to the input. As far as I can tell, the
ADC/DAC and the FPGA are set up correctly, and my function generator
and scopes are set to 50 Ω impedance.
Has anyone seen this before?
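For what it's worth, for a sinusoidal test signal a plain sign flip of the samples is indistinguishable on the scope from a 180-degree phase lag, so any stage that negates the data would produce exactly this symptom. A quick NumPy check of that equivalence:

```python
import numpy as np

# Negating a sine is identical to shifting its phase by pi (180 degrees),
# so a sign inversion anywhere in the chain shows up as a 180-degree lag.
t = np.linspace(0, 1, 1000, endpoint=False)
inverted = -np.sin(2 * np.pi * 5 * t)          # sign-flipped samples
shifted = np.sin(2 * np.pi * 5 * t + np.pi)    # 180-degree phase shift
print(np.allclose(inverted, shifted))  # True
```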

Have you disabled the TX coarse and fine mixers in the AD9862? Even if
you set their frequency to 0, they will still have a fixed rotation
unless you disable them.
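To illustrate the point (this is just a numerical sketch, not the AD9862's internal datapath): a digital mixer multiplies the signal by exp(j*(2*pi*f*n + phi)). Setting f = 0 stops the frequency translation, but the constant factor exp(j*phi) remains, and if that residual phase happens to be pi the output is simply the negated input, i.e. a 180-degree lag:

```python
import numpy as np

# A mixer multiplies by exp(j*(2*pi*f*n + phi)).  With f = 0 the
# multiplier is the constant exp(j*phi): no frequency shift, but still
# a fixed rotation.  phi = pi turns the signal into its negation.
n = np.arange(8)
x = np.cos(2 * np.pi * 0.1 * n)                    # some real test signal
phi = np.pi                                        # hypothetical residual phase
mixer = np.exp(1j * (2 * np.pi * 0.0 * n + phi))   # f = 0, fixed phase
y = x * mixer
print(np.allclose(y.real, -x))  # True: the f=0 mixer just inverts the signal
```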