On Wed, Jul 23, 2008 at 12:12:46AM +0200, Martin Emrich wrote:
> I have connected my VDR box to my TV via a DVI-to-HDMI cable, set the resolution to 1920x1080 and let the graphics card do the upscaling instead of the TV, because the quality looks IMHO better this way. But
Ok. But if you do so, you still have to deinterlace in software, because any scaling in the Y dimension intermixes even/odd fields in the frame buffer, ultimately producing a totally messed-up VGA output signal.
Scaling in the X dimension, however, is always allowed, e.g. for switching between 4:3 and 16:9 formats on a 16:9 TV set.
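To illustrate the point, here is a small sketch of my own (not from VDR or the patch; the 480/720 line counts and the nearest-neighbour scaler are assumed for the example). Row i of an interlaced frame belongs to field i % 2, and Y-scaling makes many output rows sample a source row from the other field:

```python
# Toy illustration: why Y-scaling an interlaced frame mixes fields.
# Assumed: 480 interlaced source lines scaled to 720 output lines
# with a nearest-neighbour scaler.

SRC_ROWS = 480   # hypothetical source height (interlaced)
DST_ROWS = 720   # hypothetical scaled height

def source_row(dst_row: int) -> int:
    """Nearest-neighbour mapping of an output row to its source row."""
    return dst_row * SRC_ROWS // DST_ROWS

# An output row j should carry field j % 2; count rows where the
# sampled source row comes from the other field instead.
mismatches = sum(
    1 for j in range(DST_ROWS) if source_row(j) % 2 != j % 2
)
print(f"{mismatches} of {DST_ROWS} output rows carry the wrong field")
```

With these numbers, half of the output rows end up carrying pixels from the wrong field, which is exactly the messed-up signal described above; an X-only scaler never touches the row-to-field mapping.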
> here still the same problem is present, the refresh rate of the graphics card is not bound to the field rate of the incoming TV signal, so I can either disable sync-to-vblank and have tearing artefacts or enable it and have an unsteady framerate.
Right. Even if you must still use software deinterlacing for some reason, you benefit from the 'sync_fields' patch: you can then enable sync-to-vblank, and the patch dynamically synchronizes the graphics card's vblanks with the TV signal's field updates, thus avoiding unsteady frame rates at the VGA/DVI/HDMI output.
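The idea can be sketched roughly like this (a toy simulation of my own, not the actual sync_fields code; the 50 Hz field rate, the initial drift, and the loop gains are all assumed values). A small control loop nudges the card's refresh period until the phase error between vblank and field update dies out:

```python
# Toy phase-lock sketch: pull the graphics card's refresh period onto
# the TV field rate so vblank and field update stay aligned.
# All constants below are assumptions for the illustration.

FIELD_PERIOD = 20.0     # ms, 50 Hz PAL field rate (assumed)
vblank_period = 19.97   # ms, free-running card refresh, slightly off (assumed)
phase = 0.0             # ms, vblank position relative to the field update
KP = 0.1                # proportional gain (assumed)
KD = 0.2                # damping gain on the phase change (assumed)

for _ in range(500):    # one iteration per field
    prev_phase = phase
    # The phase drifts by the period difference every field.
    phase += vblank_period - FIELD_PERIOD
    # Wrap the phase into [-FIELD_PERIOD/2, FIELD_PERIOD/2).
    phase = (phase + FIELD_PERIOD / 2) % FIELD_PERIOD - FIELD_PERIOD / 2
    # Nudge the refresh period to drive the phase error to zero.
    vblank_period -= KP * phase + KD * (phase - prev_phase)

print(f"locked period: {vblank_period:.4f} ms, residual phase: {phase:.4f} ms")
```

After a few hundred fields the simulated refresh period converges to the field period and the phase error goes to zero; the real patch achieves the same effect by adjusting the CRTC timing against the incoming TV signal.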
> I wonder if your patch could be applied to a DVI/HDMI connection, too? It's a Radeon X850, currently with xf86-video-ati 6.6.3 and xorg-server 1.4.
In your case the only prerequisite is that your Radeon X850 is supported by the Radeon DRM driver. DRM is normally shipped with the kernel, so this is a kernel/driver issue. I don't expect problems here, though I have not tested the X850 myself yet.
-Thomas