[vdr] which tv-out card?

Niko Mikkila nm at phnet.fi
Tue Nov 15 01:44:53 CET 2005


On Mon, 14 Nov 2005 23:49:07 +0000
Jon Burgess <jburgess at uklinux.net> wrote:

> This doesn't appear to be just a pure interlacing problem. I've tried 
> tweaking the 50Hz TV-out of EPIA, NVidia & ATI VGA systems and none of 
> them match the quality of hardware designed for dedicated TV usage, e.g. 
> DXR3, PVR350 (and presumably the FF cards although I've not tried one 
> myself.).

NVidia and ATI cards filter the TV-out display by default to reduce flicker.
Of course, this destroys interlacing as well. Some of the more advanced TV-out
configuration programs allow turning the filter off on some chips, but even
that isn't enough: the video decoder or player still needs to synchronize its
output to the display fields so that they are always shown in the right
order. AFAIK only the Matrox cards provide an interrupt that makes this
possible on the software side.
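To see why the field order matters, here is a small toy model (plain Python,
nothing to do with VDR's or any driver's actual code): each interlaced frame
carries two fields sampled 20 ms apart at a 50 Hz field rate, and a player
that cannot lock onto the output field phase may present them
top/bottom-swapped, reversing the temporal order inside every frame.

```python
# Toy model: horizontal position of a moving object, sampled once per
# field (20 ms apart at 50 Hz). Two consecutive samples form one frame.
FIELDS = 8
motion = list(range(FIELDS))  # object moves steadily: 0, 1, 2, ...

def display(fields, swapped):
    """Return the temporal order in which field samples hit the screen."""
    out = []
    for i in range(0, len(fields), 2):
        top, bottom = fields[i], fields[i + 1]
        # A player locked to the output phase shows top then bottom;
        # an out-of-phase player shows them the other way around.
        out += [bottom, top] if swapped else [top, bottom]
    return out

in_sync = display(motion, swapped=False)  # smooth, monotonic motion
no_sync = display(motion, swapped=True)   # back-and-forth judder

print("synced :", in_sync)
print("swapped:", no_sync)
```

With correct sync the positions come out monotonically (0, 1, 2, 3, ...);
swapped, every frame steps backwards and then jumps ahead (1, 0, 3, 2, ...),
which is exactly the juddering motion you see when field sync is lost.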

For a very detailed discussion of these issues, see the VDR mailing list
archives from October 2004. The thread starts here, but you'll need to read
it through to get the complete picture (the situation hasn't changed since):
http://www.linvdr.net/mailinglists/vdr/2004/10/msg00922.html

Luckily, interlaced CRT televisions are being phased out, and soon we'll
only have nice progressive panels and projectors. Just waiting for the
1080p @ 50Hz broadcasts ;)

--
Niko Mikkilä
