John Watkinson's literature of the mid 1980s gives examples of 'typical' anti-aliasing filters: a nine-pole passive elliptic, which is 60dB down at 22kHz, and a thirteen-pole which is 80dB down. Group delay was a problem for such high-order filters, which needed compensation for exacting applications.
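As a rough sketch of what such a filter looks like on paper (the ripple and band-edge figures below are my assumptions, not Watkinson's), SciPy's elliptic design routines reproduce the nine-pole example:

    import numpy as np
    from scipy import signal

    # Assumed targets: 20kHz passband edge, 22kHz stopband edge,
    # 0.2dB passband ripple, 60dB stopband attenuation.
    wp, ws = 2 * np.pi * 20e3, 2 * np.pi * 22e3
    order, wn = signal.ellipord(wp, ws, gpass=0.2, gstop=60, analog=True)
    print("minimum elliptic order:", order)  # nine poles for these targets

    # Design the analogue prototype and check attenuation at the band edges.
    b, a = signal.ellip(order, 0.2, 60, wn, btype='low', analog=True)
    w, h = signal.freqs(b, a, worN=np.array([wp, ws]))
    for f, resp in zip((20e3, 22e3), h):
        print(f"{f / 1e3:.0f}kHz: {20 * np.log10(abs(resp)):.1f}dB")

The narrow transition band is what forces the order up, and the group-delay ripple near the band edge is what the compensation mentioned above had to correct.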

The Chesky Test CD contains tracks recorded with 128-times oversampling, and "standard" tracks for comparison. So I assume there is no standard for oversampling (and anti-aliasing) at the recording end.

When designing a long-term system architecture, standardising the bare minimum is the best option. The CD acts as the interface between the recorder and the player. Defining the interface leaves plenty of scope for improvements in technology. This is how railways, roads, the telephone system and the internet work - define the interface, leave the rest flexible.

Sometimes people are tempted to make the interface definition itself flexible, with scope for future changes. My experience in IT is that this almost always fails: it creates complications, and it nearly always turns out that you soon want a change you haven't allowed for. Examples of poor interface design: almost any Microsoft file format, which is why they cause such trouble with intergenerational compatibility.

Sony used a brick-wall filter in their first product, then quickly moved to oversampling (OS). Is that a sign that analog brick-wall filtering was working? Or was it just a sign that the technology wasn't yet capable of making DACs fast enough to cope with an oversampled signal? Sony needed to have something out quickly, in competition with Philips.
To process an oversampled signal you need faster settling times in the DAC stage, with low distortion. That took some time to develop.
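Back-of-envelope numbers (mine, not from any datasheet) show why settling time becomes the constraint as the oversampling ratio climbs:

    # Sample period the DAC must settle within at each oversampling
    # ratio of the 44.1kHz CD rate (rough arithmetic, not Sony's specs).
    FS = 44_100  # Hz
    for ratio in (1, 2, 4, 8, 16):
        rate = FS * ratio
        print(f"{ratio:2d}x: {rate / 1e3:7.1f}kHz -> settle within {1e6 / rate:5.2f}us")

At 16x the converter has roughly 1.4 microseconds per sample instead of nearly 23, and it has to hit the target value cleanly, not just quickly.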

Or... why do you think they moved as fast as they did to 2x, 4x, 8x, 16x oversampling??? A conspiracy to make bad sound?

Fascinating chapter of industrial history. Obviously consortia contain contradictions, with each party looking to take up a differentiated position on the same bandwagon.

Sony's insistence on 44.1kHz and 16 bits was a bit last-minute. That suggests to me that there cannot have been an agreed exact standard for filtering during the recording process.

Perhaps Sony had less to gain from oversampling because their DACs needed a downstream audio switching stage that required stringent filtering anyway. Also, the PWM current output was timed by a clock running at over 512 times the sample rate, which was a limiting feature. In their players using the TDA1541 I think they followed Philips' practice.
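For scale, 'over 512 times the sample rate' puts that PWM clock above 22MHz, which a quick check confirms (my arithmetic, not a Sony figure):

    # Clock implied by 'over 512 times the sample rate' at 44.1kHz.
    FS = 44_100
    print(f"512 x {FS}Hz = {512 * FS:,}Hz (~{512 * FS / 1e6:.1f}MHz)")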

Who made the machines used in the recording process? I would guess that de facto standards evolved in the industry.