Yes, when I refer to an interlaced display I include flat panels that accept 1080i, and there are a lot of those.

I am pushing this because it potentially avoids a lot of the problems associated with watching TV material (XBMC-PVR) and camcorder home movies, which are generally interlaced. The right hardware combination to support this is woefully limited. For example, all nVidia ION-based systems are unable to perform combined IVTC and temporal-spatial deinterlacing for 1080i video at the required frame rate. AMD E350 boards running Linux cannot do ANY hardware-accelerated deinterlacing because it is not supported by VAAPI (this actually applies to all AMD graphics cards).

Currently the only adequate hardware setup for Linux and XBMC is one containing an nVidia graphics system with enough grunt to perform IVTC and temporal-spatial deinterlacing at 1080i 60fps. Since the only graphics systems capable of this are discrete PCIe cards, the motherboard must have a PCI Express x16 slot too. That means a large form-factor PC case. These cards are also power hungry.

Did you solve the issue with scaling interlaced material, or are you relying on output at native resolution? Native resolution would be pretty annoying because I haven't seen a single TV yet where you could disable overscan for SD content (only for HD content), so you'll lose the edges of the picture when outputting SD material at native resolution.

No, I did not solve the issue with scaling interlaced material. It has to be IVTC'd and deinterlaced before scaling.

You are right to be concerned about SD content. In the most basic hardware setups, displaying SD at native resolution would be desirable, but I suspect this is very tricky. For HDMI-connected displays, 576i and 480i modes require pixel doubling to stay above the minimum pixel clock specified for HDMI. This means both the video and all of the GUI would need to be rendered to a 720x576 frame for 576i (720x480 for 480i) and then stretched to the full 1440x576i. In OpenGL you would texture map with the setting glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST) to double the pixels horizontally.
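To make the pixel-doubling idea concrete, here is a minimal CPU-side sketch of what GL_NEAREST magnification effectively does when a 720-pixel-wide frame is stretched to 1440: each source pixel is simply repeated. The helper name `doubleRow` is my own for illustration; this is not XBMC code, and in practice the GPU does this during texture sampling.

```cpp
#include <cstdint>
#include <vector>

// Horizontal pixel doubling: expand one row of a 720-pixel frame to 1440
// pixels by repeating each pixel, as HDMI pixel repetition for 576i/480i
// requires. Illustrative only; the real work happens on the GPU via
// nearest-neighbour texture magnification.
std::vector<std::uint32_t> doubleRow(const std::vector<std::uint32_t>& row)
{
    std::vector<std::uint32_t> out;
    out.reserve(row.size() * 2);
    for (std::uint32_t px : row)
    {
        out.push_back(px);  // original pixel
        out.push_back(px);  // duplicated neighbour
    }
    return out;
}
```

Because the duplication is exact, no new information is invented; the display's scaler sees the same 720 distinct samples it would in a native 576i/480i signal.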

Also, mode switching the TV display and the low-res GUI might be annoying. I suspect a halfway-house approach might be more suitable: an ION system, say, is quite capable of IVTC and deinterlacing of SD material, so you could then display this on a 1080i display. For 1080i material the deinterlacer would be turned off and the interlacer turned on.

Since there is still a great deal of content published in interlaced form (especially SD and HD TV in the Netherlands), deinterlacing is among the last of the problems I am still trying to tackle.

My nVidia ION system certainly does not have the power to properly deinterlace 1080i content, and even the rather powerful PCI-E card in my upstairs PC (a GT 430) is not always up to the job.

Being able to send the output to my TV in its original interlaced form (just as my TV provider's set-top box does) would be awesome!

This would be of HUGE interest to me, and I know a lot of others. Many people with HT setups have pre/pros or even receivers (in addition to those that feed directly to their TVs) with onboard chips that handle scaling and deinterlacing duties far more effectively than XBMC can in software (or even with hardware assist). To me, this goes hand-in-hand with bitstreaming audio: let the components that are designed for this do the work.

Some good news: the patch to the FFmpeg tinterlace filter has been adopted (slightly modified) by the FFmpeg maintainers.

I'm now trying to see if it is possible to do something similar for VDPAU output. I ruled out the get-bits/put-bits method because for 1080i this would involve something like 350 MB/s of transfers, which I suspect ION would not be able to achieve. Instead I'm trying the NV_vdpau_interop OpenGL extension to manipulate the video with OpenGL. It's a steep learning curve for me.
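As a rough sanity check on that bandwidth figure, here is a back-of-envelope calculation, assuming uncompressed 4:2:2 frames (2 bytes per pixel) at 60 fps. The helper `bytesPerSecond` is hypothetical and the bytes-per-pixel and frame-rate choices are my assumptions, not measurements; a read out of VDPAU plus a write back roughly doubles the single-copy number, which puts the total in the same hundreds-of-MB/s range cited above.

```cpp
#include <cstdint>

// Back-of-envelope bandwidth for copying raw video frames over the bus,
// e.g. pulling decoded surfaces out of VDPAU and pushing processed ones
// back. Illustrative arithmetic only, not a measured figure.
std::uint64_t bytesPerSecond(int width, int height, double bytesPerPixel, int fps)
{
    const double frameBytes = static_cast<double>(width) * height * bytesPerPixel;
    return static_cast<std::uint64_t>(frameBytes) * static_cast<std::uint64_t>(fps);
}
```

For 1920x1080 at 2 bytes/pixel and 60 fps this gives about 249 MB/s for a single copy direction, so a full round trip easily exceeds what an ION's bus and memory subsystem can comfortably sustain.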

The problem with this approach is that it requires OpenGL header files specific to the nVidia drivers. Do the XBMC developers have any kind of policy on the use of OpenGL extensions specific to a particular vendor?

More good news. After some advice from FernetMenta I dropped the FFmpeg filter approach and successfully implemented a new interlaced output mode within the XBMC render manager, which I have called RENDER_WEAVEX2. The mode works with the software and shader rendering paths. It can also work with VDPAU rendering, but to achieve this, this fork of XBMC must be modified with the new mode: https://github.com/FernetMenta/xbmc

I will request that the new mode be implemented in both the official and FernetMenta XBMC repos.
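For readers unfamiliar with the term, the "weave" at the heart of a mode like RENDER_WEAVEX2 is just interleaving the two fields of a frame line by line, so that when the display is itself running interlaced, each field reaches the screen intact and at the right time. The sketch below is my own illustration of the operation, not XBMC's actual implementation; rows are modelled as strings for clarity.

```cpp
#include <string>
#include <vector>

// Weave two fields into one full frame by interleaving their lines:
// top-field lines occupy the even rows, bottom-field lines the odd rows.
// A weave-at-double-rate output mode presents frames built this way so an
// interlaced display scans each field out unmodified.
std::vector<std::string> weave(const std::vector<std::string>& top,
                               const std::vector<std::string>& bottom)
{
    std::vector<std::string> frame;
    frame.reserve(top.size() * 2);
    for (std::size_t i = 0; i < top.size(); ++i)
    {
        frame.push_back(top[i]);     // even line, from the top field
        frame.push_back(bottom[i]);  // odd line, from the bottom field
    }
    return frame;
}
```

The crucial property is that no filtering happens between fields, which is why field sync with the display matters so much: if the output slips by one field, top and bottom fields swap and motion judders badly.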

Current limitations: WEAVEX2 works well in 1080i output modes with perfect field sync, but 576i and 480i modes are not possible yet because of the pixel-doubling requirement of HDMI at these low resolutions. I have tested 576i video at its original size within a 1080i mode (imagine big black borders around a small picture) and the TV quite happily accepts that, but it is not suitable for comfortable viewing.

An alternative approach for 576i and 480i might be to do what broadcast digital video effects (DVE) devices did when they had to scale interlaced video.

If you scale an interlaced frame as a frame, you end up mangling the two interlaced fields together and you get all sorts of nastiness (motion judder, odd banding on motion, etc.).

However, if you scale in the field domain, you avoid this. Of course you don't get the improved vertical resolution that might be possible with a high-quality deinterlace, scale (in the 2x frame domain) and re-interlace, but it might be an option? (As most DVEs were shrinking rather than zooming pictures, the resolution loss was less of an issue.)

Effectively you'd scale each 240-line (480i) or 288-line (576i) field to a 540-line field (1080i). As you are scaling within the field domain, you don't end up with mangled fields.
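The field-domain scaling described above can be sketched as follows. This is a hypothetical illustration (the helper name `scaleField` and the nearest-neighbour kernel are my choices for brevity; a real scaler would use a proper polyphase filter): each field is resized vertically on its own, so lines from the opposite field never bleed in.

```cpp
#include <vector>

// Scale one field vertically from its source line count to outLines,
// e.g. the 288 lines of a 576i field up to the 540 lines of a 1080i field.
// Because only lines from the same field are consulted, the two fields of
// the picture are never mixed. Nearest-neighbour is used for brevity.
std::vector<int> scaleField(const std::vector<int>& field, int outLines)
{
    std::vector<int> out(static_cast<std::size_t>(outLines));
    for (int y = 0; y < outLines; ++y)
        out[y] = field[y * field.size() / outLines];  // nearest source line
    return out;
}
```

Running this on each of the two fields independently, then re-interleaving them for output, is the DVE-style alternative to deinterlacing: cheaper, and free of inter-field artefacts, at the cost of the extra vertical resolution a good deinterlacer could have recovered.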