[vdr] 1080p ready VDR
theunis.potgieter at gmail.com
Mon Dec 29 08:55:45 CET 2008
On 26/12/2008, Tony Houghton <h at realh.co.uk> wrote:
> On Thu, 25 Dec 2008 23:54:59 +0100
> Sascha Vogt <FunkyFish at gmx.net> wrote:
> > Artem Makhutov schrieb:
> > > You can also use a Nvidia Video card with VDPAU, which will do the
> > > decoding in hardware. Then the CPU load will be less than 10%, but
> > > this is currently in development and does not work reliably right now.
> > > Using VDPAU you will be able to watch HDTV with an Athlon X2 4850e
> > > CPU.
> > As there are hardly any HD programs atm I think I'll go that way. I feel
> > much more comfortable with a NVIDIA anyway (all my other PCs have one
> > too). Hopefully that VDPAU will get stable soon (Soon as in till 2010).
> > Point is the 4850e has a TDP of 45 Watts, which is much easier to
> > cool than those "bigger" CPUs.
> I think that's a good idea. VDPAU may still be a bit too cutting edge
> for comfort, but support for it is bound to improve, and HD is still
> cutting edge in VDR in general (I haven't heard of anyone releasing a
> patch for the BBC HD audio pid problem yet).
> I chose a 4850e too. At first I tried ATI 3200 onboard graphics, like
> you thought of, but it was dreadful, very unstable with ATI's fglrx
> driver, while the open radeonhd driver doesn't even support Xv on it yet
> AFAICT. I replaced it with an NVidia 8200 based board. I think there's a
> performance problem with the 177 series nvidia driver on basic
> acceleration, but it's fine with Xv and OpenGL, stable, and hopefully
> the performance is fixed in 180. I haven't tried VDPAU yet.
> IME the 4850e can play back 720p H.264 quite easily, but can't manage
> 1080i from DVB. It might be trying to run a deinterlacing filter that
> pushes the demand over the limit. I'm quite disappointed that they
> standardised on 1080i, even for movies by the look of it, instead of
Deinterlacing is only required if your source/stream runs at a
different refresh rate than your output monitor. The problem with
nvidia and the TV-out adapter (s-video plug) is that, after a certain
driver version, they locked their closed-source driver to 60Hz on
newer cards, which hurts when your output happens to be a CRT
screen/monitor: PAL mostly runs at 50Hz. If you do not enable a
deinterlacer you will see tearing and/or the individual fields, and a
deinterlacer costs extra CPU load (offloading it to the GPU can help).
The best solution is to get your output device to run at the same
refresh rate as the source. There are different approaches: you can
play with your xorg.conf to disable EDID in the video driver and
define a mode that tells the driver you want 50Hz on your LCD or
CRT/Plasma; or on ATI there is an RGB (red, green, blue) patch, which
requires you to plug in an adapter cable/board from your VGA plug to
your output monitor. Look for "PCI fun (RGB/PAL over VGA at variable
frame rate)" in the mailing list. So there is another trick for
saving on your electricity bill.
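As a rough sketch of the xorg.conf approach: the option names below
are for the closed-source nvidia driver, the section identifiers are
made up, and the ModeLine uses the standard 1080i/50 broadcast timing
(74.25 MHz pixel clock); verify the timings against what your own
display actually accepts, e.g. with cvt/gtf.

```
# Hypothetical fragment: ignore EDID and force a 50Hz interlaced mode.
Section "Monitor"
    Identifier "TV"
    # Standard 1080i50 timing; adjust for your display.
    ModeLine "1920x1080_50i" 74.25  1920 2448 2492 2640  1080 1084 1094 1125 interlace +hsync +vsync
EndSection

Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
    Option     "UseEDID" "false"              # don't let EDID override our mode
    Option     "ModeValidation" "NoEdidModes"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "nvidia0"
    Monitor    "TV"
    SubSection "Display"
        Modes "1920x1080_50i"
    EndSubSection
EndSection
```

With a mode like this the driver outputs 50 fields per second, so a
50Hz PAL broadcast can be shown field-for-field without any software
deinterlacer.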
The eHD works for up to 1080i, which is fine for DVB-S2 etc., and it
seems they don't want to go 1080p because of bandwidth constraints.
But what happens if you want to watch DivX/Xvid, or happen to have a
Blu-ray device with a 1080p source? Then the eHD is not going to help
you, and you need software (to transcode to a format the eHD can
output). So decide what the main purpose of the machine will be,
unless you have the will and patience to fine-tune your software
machine setup.