So I got myself a 6404 and I am trying to do some things with it using LabVIEW as an interface.

I got the data from the oscilloscope with no problem using the block sub-VI, and it looks good in all but one regard: the voltage shown on the LabVIEW graph is half the voltage I am putting in from my generator.
I am running a 5 V square wave. When I use the PicoScope software I get the correct readout of 5 V, but when I use LabVIEW with the conversion from ADC counts to analog values I get only about 2.5 V (the conversion is (raw ADC / 32152) * Range). It isn't a conversion issue, as the raw ADC values only reach about 17k.

This would suggest that you have the wrong range set in LabVIEW. What range are you using, and what range in PicoScope?

There can only be one conversion formula, based on the maximum ADC count and the range.

Regarding both things.

As I was checking in the PicoScope software, I was using auto ranging and then added a measurement for the maximum. This gave me about 5 V.

For LabVIEW I am getting rather strange behaviour. For example, with the channel range set to 200 mV I see saturation and the reading is not reliable. When I go above that value the signal no longer saturates, but it stays at 0.5 amplitude no matter what the range is.
From what I understand, in my case the signal amplitude is quantised by the 16 bits (which is confirmed by the raw ADC graph, which swings between 32152 and -32152), NOT by the given range. This is not very robust unless we know the maximum amplitude of the signal in advance.

The scope will return ADC counts in the range -32152 to 32152 regardless of the chosen voltage range.

If you have set a ±200 mV range then 32152 = 200 mV and -32152 = -200 mV; if you have set ±2 V then 32152 = 2 V and -32152 = -2 V. Hence the equation that uses the range and the maximum count to work out the voltage from an individual ADC count.
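The scaling described above can be sketched in a few lines of Python (the maximum count used here is the one quoted in this thread; check the ps6000 programmer's guide for the exact constant your driver reports):

```python
# Maximum ADC count as quoted in this thread.
MAX_ADC_COUNT = 32152

def adc_to_volts(raw_adc, range_volts):
    """Convert a raw ADC count to volts for the selected input range."""
    return (raw_adc / MAX_ADC_COUNT) * range_volts

# With a +/-2 V range selected, a full-scale count maps to 2 V
# and half scale maps to 1 V:
print(adc_to_volts(32152, 2.0))   # 2.0
print(adc_to_volts(-16076, 2.0))  # -1.0
```

The same count therefore maps to a different voltage depending on the range, which is why the range passed to the conversion must match the range actually set on the channel.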

As you can see from this, we use the top 8 bits of a 16-bit value to represent the ADC count for an 8-bit scope, the top 12 bits for a 12-bit scope, and all 16 bits for our 16-bit scopes.
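A minimal sketch of that bit alignment, assuming the driver always returns 16-bit counts with lower-resolution samples placed in the most significant bits (the sample values here are illustrative, not real captures):

```python
# An 8-bit sample is shifted into the top 8 bits of the 16-bit count.
sample_8bit = 100
count_16bit = sample_8bit << 8     # occupies the top byte
print(count_16bit)                 # 25600

# A 12-bit sample is shifted into the top 12 bits.
sample_12bit = 1000
count_12bit = sample_12bit << 4
print(count_12bit)                 # 16000
```

This is why the raw counts from an 8-bit scope always move in steps of 256 rather than 1.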

Our own software can auto-range by stepping up a range if the value is greater than the current maximum, or stepping down if the value would be less than the maximum of the next range down.
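That stepping rule can be sketched as follows (the range list here is a hypothetical example, not the actual PicoScope range table):

```python
# Hypothetical ordered list of selectable ranges, in volts.
RANGES = [0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0]

def next_range(current_index, peak_volts):
    """Step up one range on overrange; step down one range if the
    peak would still fit within the next range down."""
    if peak_volts > RANGES[current_index] and current_index < len(RANGES) - 1:
        return current_index + 1
    if current_index > 0 and peak_volts < RANGES[current_index - 1]:
        return current_index - 1
    return current_index

print(next_range(3, 0.7))   # 4: 0.7 V overranges the 0.5 V setting
print(next_range(3, 0.15))  # 2: the peak fits within the 0.2 V range below
```

Note that auto-ranging like this needs the measured peak to be trustworthy, which it is not while the input is saturating.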

I understand what is happening and it makes a lot of sense, but I still see two problems:

1. It still doesn't explain why my amplitude is halved in LabVIEW.

2. Even though my signal is 5 V, it should theoretically saturate all the way until the range reaches 5 V, while in my case it stops saturating at 500 mV, which gives a false reading and the appearance that the signal is 250 mV, not 2.5 V.

Well, if the conversion range for the multiplication should be in mV then I had it wrong. But it doesn't really matter: after I converted it to mV it still doesn't solve either of my problems, neither the halved amplitude nor the saturation level issue. I made the scheme I am using into a sub-VI and attached it.

Still, the main issue here is the halved amplitude. The signals I will be measuring will stay in a rather predictable range, and I will be interested in measuring timings, not necessarily levels, so I could get away with using the raw ADC range, but using volts would be helpful.

Looking at the block diagram you originally posted, there is a broken link between the Channel A Settings and the PS6000 Settings sub-VI block. I don't think you are correctly setting up the channel, which is why the readings are odd and you get saturation.