On 3 Feb 2009, at 00:58, Theunis Potgieter wrote:
There is a VGA -> RGB connector that you can assemble or buy. I've seen in xorg.conf that you can still force your VGA output to run at 50Hz by disabling EDID in the nvidia driver's xorg.conf options.
Yes, this is what I'll be trying, but I'll just get an nvidia component breakout box; they're about $10 on eBay.
It appears that the component output breakout cable still uses the TV encoder chip.
I was finally (!) able to avoid tearing by disabling the Composite extension.
Section "Extensions" Option "Composite" "Disable" EndSection
I then got pixel-perfect component PAL output without overscan. But now I'm left with... judder!!!
Watching live sports I get judder; it almost looks like it's dropping every even or odd field. It might just be that field parity is not observed, but it's a bit hard to make out.
I'm beginning to think that nvidia hardware is useless for 50Hz PAL output.
On Mon, 16 Feb 2009 02:02:37 +1000 Torgeir Veimo torgeir@pobox.com wrote:
Watching live sports I get judder; it almost looks like it's dropping every even or odd field. It might just be that field parity is not observed, but it's a bit hard to make out.
By "judder" do you just mean a slight lack of smoothness or is it quite severe with moving objects jumping backwards each frame? If the former it's probably dropping fields (or deinterlacing to 25fps), if the latter it's got the field parity wrong.
I'm beginning to think that nvidia hardware is useless for 50Hz PAL output.
Or at least it is with the current drivers. Would it be straightforward to replace your card with an old AGP one? Patched ATI drivers and xine-lib, or a Matrox G4x0 (or G550?) with DirectFB, are probably more suitable for a CRT TV.
On 16 Feb 2009, at 02:35, Tony Houghton wrote:
On Mon, 16 Feb 2009 02:02:37 +1000 Torgeir Veimo torgeir@pobox.com wrote:
Watching live sports I get judder; it almost looks like it's dropping every even or odd field. It might just be that field parity is not observed, but it's a bit hard to make out.
By "judder" do you just mean a slight lack of smoothness or is it quite severe with moving objects jumping backwards each frame? If the former it's probably dropping fields (or deinterlacing to 25fps), if the latter it's got the field parity wrong.
It's a bit hard to tell, as it's a 100Hz TV doing its own deinterlacing as well. I'd guess it's a field parity issue though. I did see some "non-linear" judder watching football last night, but it could just as well be the TV's deinterlacer being confused by the wrong field parity.
I'm not sure if VDPAU actually turns off deinterlacing; the log says "enabled features", but the values are set to 0. (This is the VDPAU-patched xine-lib.)
vo_vdpau: recreate mixer to match frames: width=720, height=576, chroma=0
vo_vdpau: enabled features: temporal=0, temporal_spatial=0
vo_vdpau: enabled features: inverse_telecine=0
vo_vdpau: disable noise reduction.
vo_vdpau: disable sharpness.
vo_vdpau: vdpau_update_csc: hue=0.000000, saturation=1.000000, contrast=1.000000, brightness=0.000000, color_standard=0
vo_vdpau: output_surface size update
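For reference, the "temporal=0, temporal_spatial=0" values in that log presumably correspond to the VDPAU feature-enable call; below is a minimal sketch of what that looks like at the API level. The helper name is made up, and the function pointer is assumed to have been fetched earlier with VdpGetProcAddress:

/* Sketch only: explicitly disable (or enable) the VDPAU deinterlacing
 * features on an existing video mixer.  The features must also have
 * been requested when the mixer was created. */
#include <vdpau/vdpau.h>

static VdpStatus set_deinterlacing(VdpVideoMixerSetFeatureEnables *set_enables,
                                   VdpVideoMixer mixer, VdpBool enable)
{
    VdpVideoMixerFeature features[] = {
        VDP_VIDEO_MIXER_FEATURE_DEINTERLACE_TEMPORAL,
        VDP_VIDEO_MIXER_FEATURE_DEINTERLACE_TEMPORAL_SPATIAL,
    };
    VdpBool enables[] = { enable, enable };

    /* "temporal=0, temporal_spatial=0" in the xine log matches passing
     * VDP_FALSE here for both features. */
    return set_enables(mixer, 2, features, enables);
}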
The weird thing is that if I re-enable the Composite extension, the judder is gone, but then of course I see tearing.
I run xine as
xine --verbose=2 -V vdpau -A oss -F -r square --post vdr_audio --post upmix_mono "vdr:/tmp/vdr-xine/stream#demux:mpeg_pes"
I'm beginning to think that nvidia hardware is useless for 50Hz PAL output.
Or at least it is with the current drivers. Would it be straightforward to replace your card with an old AGP one? Patched ATI drivers and xine-lib, or a Matrox G4x0 (or G550?) with DirectFB, are probably more suitable for a CRT TV.
Of course a Matrox card gives the best output with correct field parity, but I'm interested in testing out VDPAU acceleration.
It would be good to have a VDR recording of some test material that can be used to detect judder, tearing and other artifacts.
On 16 Feb 2009, at 12:12, Torgeir Veimo wrote:
On 16 Feb 2009, at 02:35, Tony Houghton wrote:
On Mon, 16 Feb 2009 02:02:37 +1000 Torgeir Veimo torgeir@pobox.com wrote:
Watching live sports I get judder; it almost looks like it's dropping every even or odd field. It might just be that field parity is not observed, but it's a bit hard to make out.
By "judder" do you just mean a slight lack of smoothness or is it quite severe with moving objects jumping backwards each frame? If the former it's probably dropping fields (or deinterlacing to 25fps), if the latter it's got the field parity wrong.
It's a bit hard to tell, as it's a 100Hz TV doing its own deinterlacing as well. I'd guess it's a field parity issue though. I did see some "non-linear" judder watching football last night, but it could just as well be the TV's deinterlacer being confused by the wrong field parity.
Andy Ritger (nvidia) said in a mail to the xorg mailing list some time ago:
"If the application doesn't enable de-interlacing, NVIDIA's VDPAU implementation will currently copy the weaved frame to the "progressive" surface, and whether it will come out correctly will depend whether the window's offset from the start of the screen is odd or even."
I take this to imply that field parity should be possible, but the application in use has to detect the field flag from the source material and set the Y offset to 0 or 1 accordingly.
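If so, on the application side that would amount to something like the sketch below; the helper name, the window handle and the top_field_first flag are all just illustrative:

/* Sketch only: nudge the output window by one scanline so the weaved
 * frame's field parity matches the display.  Assumes an Xlib Display,
 * the player's Window and its nominal position, plus the source
 * material's top-field-first flag. */
#include <X11/Xlib.h>

static void align_field_parity(Display *dpy, Window win,
                               int base_x, int base_y, int top_field_first)
{
    /* Moving the window down by one line swaps which source field
     * lands on the even output scanlines; which offset is "correct"
     * depends on the mode and has to be found by trying both. */
    int y_offset = top_field_first ? 0 : 1;

    XMoveWindow(dpy, win, base_x, base_y + y_offset);
    XFlush(dpy);
}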
Progressive material seems to display correctly.
On Mon, 16 Feb 2009 13:26:18 +1000 Torgeir Veimo torgeir@pobox.com wrote:
Andy Ritger (nvidia) said in a mail to the xorg mailing list some time ago:
"If the application doesn't enable de-interlacing, NVIDIA's VDPAU implementation will currently copy the weaved frame to the "progressive" surface, and whether it will come out correctly will depend whether the window's offset from the start of the screen is odd or even."
I take this to imply that field parity should be possible, but the application in use has to detect the field flag from the source material and set the Y offset to 0 or 1 accordingly.
It also relies on vsyncing correctly. If it syncs to the wrong field you'll get the backward juddering. I don't think VDPAU provides a way for applications to distinguish between odd and even vsyncs, but perhaps it does its own internal syncing.
And if the video has to be scaled it would have to scale each field separately then reinterlace them line-by-line at the output resolution.
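For what it's worth, the mixer API does at least let the application say which field it is handing over on each render call; whether the driver then presents it on the matching output field is another matter. A rough sketch, with the render function pointer assumed to have been obtained via VdpGetProcAddress and error handling omitted:

/* Sketch only: render one field of an interlaced picture through the
 * VDPAU video mixer, telling it whether this is the top or bottom
 * field.  Surfaces and rectangles are assumed to exist already, and
 * no deinterlacing feature is assumed to be enabled. */
#include <stddef.h>
#include <vdpau/vdpau.h>

static VdpStatus render_field(VdpVideoMixerRender *mixer_render,
                              VdpVideoMixer mixer,
                              VdpVideoSurface src,
                              VdpOutputSurface dst,
                              const VdpRect *dst_rect,
                              int is_top_field)
{
    VdpVideoMixerPictureStructure structure = is_top_field
        ? VDP_VIDEO_MIXER_PICTURE_STRUCTURE_TOP_FIELD
        : VDP_VIDEO_MIXER_PICTURE_STRUCTURE_BOTTOM_FIELD;

    /* No background, no past/future reference surfaces, no overlay
     * layers: just the current field scaled into dst_rect. */
    return mixer_render(mixer,
                        VDP_INVALID_HANDLE, NULL,  /* background */
                        structure,
                        0, NULL,                   /* past surfaces */
                        src,
                        0, NULL,                   /* future surfaces */
                        NULL,                      /* whole source */
                        dst,
                        NULL,                      /* whole destination */
                        dst_rect,                  /* video placement */
                        0, NULL);                  /* layers */
}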
On 16 Feb 2009, at 23:29, Tony Houghton wrote:
And if the video has to be scaled it would have to scale each field separately then reinterlace them line-by-line at the output resolution
Ok, I guess this single issue implies that vdpau is not fully suitable for displaying interlaced material with interlaced output.
On Mon, Feb 16, 2009 at 11:48:50PM +1000, Torgeir Veimo wrote:
On 16 Feb 2009, at 23:29, Tony Houghton wrote:
And if the video has to be scaled it would have to scale each field separately then reinterlace them line-by-line at the output resolution
@Tony Houghton: is it really possible to scale each field separately without producing artifacts? Isn't deinterlacing always necessary prior to scaling? IMHO scaling does imply that even lines are allowed to blur into odd lines if the scale factor forces this.
But artifact-free blur of both fields is NOT possible if you either
- don't deinterlace prior to scaling, or
- keep even and odd fields separated during scaling.
Ok, I guess this single issue implies that vdpau is not fully suitable for displaying interlaced material with interlaced output.
@Torgeir Veimo: not sure about this. Intel i9xx series graphics is able to scale in hardware even in interlaced mode. I use this feature for my Intel-based frame rate control patches (the SCART/RGB/PAL vga-sync-fields patch).
- Thomas
On Mon, 16 Feb 2009 21:28:56 +0100 Thomas Hilber vdr@toh.cx wrote:
On Mon, Feb 16, 2009 at 11:48:50PM +1000, Torgeir Veimo wrote:
On 16 Feb 2009, at 23:29, Tony Houghton wrote:
And if the video has to be scaled it would have to scale each field separately then reinterlace them line-by-line at the output resolution
@Tony Houghton: is it really possible to scale each field separately without producing artifacts? Isn't deinterlacing always necessary prior to scaling? IMHO scaling does imply that even lines are allowed to blur into odd lines if the scale factor forces this.
But artifact-free blur of both fields is NOT possible if you either
- don't deinterlace prior to scaling, or
- keep even and odd fields separated during scaling.
You're right. I was thinking of cases where there's a lot of movement and the fields are from different pictures, forgetting that they can alternatively add detail to static scenes.
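To make "scaling each field separately" concrete, the vertical part of such a scaler might look roughly like this; purely an illustration, working on one field at a time so lines from the other field never bleed in:

/* Sketch only: vertically resample a single field (every second line
 * of the frame) with linear interpolation.  src/dst point at the
 * first line of the field, strides are in bytes, heights in lines. */
#include <stdint.h>

static void scale_field_vertically(const uint8_t *src, int src_stride, int src_lines,
                                   uint8_t *dst, int dst_stride, int dst_lines,
                                   int width)
{
    for (int y = 0; y < dst_lines; y++) {
        /* Map the output line back into the source field. */
        double pos = (dst_lines > 1)
            ? (double)y * (src_lines - 1) / (dst_lines - 1)
            : 0.0;
        int y0 = (int)pos;
        int y1 = (y0 + 1 < src_lines) ? y0 + 1 : y0;
        double frac = pos - y0;
        const uint8_t *a = src + y0 * src_stride;
        const uint8_t *b = src + y1 * src_stride;
        uint8_t *out = dst + y * dst_stride;

        for (int x = 0; x < width; x++)
            out[x] = (uint8_t)((1.0 - frac) * a[x] + frac * b[x] + 0.5);
    }
}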
Ok, I guess this single issue implies that vdpau is not fully suitable for displaying interlaced material with interlaced output.
@Torgeir Veimo: not sure about this. Intel i9xx series graphics is able to scale in hardware even in interlaced mode. I use this feature for my Intel-based frame rate control patches (the SCART/RGB/PAL vga-sync-fields patch).
VDPAU is currently only on NVidia cards although it's possible Intel and even ATI may provide backends for it in the future. I think only the most recent Intel chipsets support H.264 decoding; does your patch work with those? Once there's a stable API for their H.264 decoding I guess we'll have the best of both worlds :-).
On Mon, Feb 16, 2009 at 09:05:04PM +0000, Tony Houghton wrote:
not sure about this. Intel i9xx series graphics is able to scale in hardware even in interlaced mode. I use this feature for my Intel-based frame rate control patches (the SCART/RGB/PAL vga-sync-fields patch).
VDPAU is currently only on NVidia cards although it's possible Intel and
right. But I thought that NVidia could eventually provide the same scale-interlaced-picture hardware feature as Intel does.
even ATI may provide backends for it in the future. I think only the most recent Intel chipsets support H.264 decoding; does your patch work with those? Once there's a stable API for their H.264 decoding I guess
The patch works with older (pre-AVIVO) Radeons and recent i9xx Intel chipsets. Currently I only support antiquated RGB/SCART :-)
Though 'durchflieger' of vdr-portal.de also supports H.264-compatible video modes with his own patches.
we'll have the best of both worlds :-).
right! It's nice to see how HDTV finally finds its way to Linux without expensive extension cards.
- Thomas
And if the video has to be scaled it would have to scale each field separately then reinterlace them line-by-line at the output resolution
Ok, I guess this single issue implies that vdpau is not fully suitable for displaying interlaced material with interlaced output.
But what about HDMI output from VDPAU cards to an LCD TV set?
Does that issue still apply there, or does the problem exist only for S-Video/RGB/component video to a CRT TV set?
Goga
On 19 Feb 2009, at 04:59, Goga777 wrote:
And if the video has to be scaled it would have to scale each field separately then reinterlace them line-by-line at the output resolution
Ok, I guess this single issue implies that vdpau is not fully suitable for displaying interlaced material with interlaced output.
But what about HDMI output from VDPAU cards to an LCD TV set?
Does that issue still apply there, or does the problem exist only for S-Video/RGB/component video to a CRT TV set?
It's an issue with interlaced output when the source material is interlaced, and no deinterlacing is performed by the gfx card. It doesn't matter whether it's HDMI, component or S-Video output that is being used.
And if the video has to be scaled it would have to scale each field separately then reinterlace them line-by-line at the output resolution
Ok, I guess this single issue implies that vdpau is not fully suitable for displaying interlaced material with interlaced output.
But what about HDMI output from VDPAU cards to an LCD TV set?
Does that issue still apply there, or does the problem exist only for S-Video/RGB/component video to a CRT TV set?
It's an issue with interlaced output when the source material is interlaced, and no deinterlacing is performed by the gfx card.
Thanks. Last question :) If GPU deinterlacing is performed by the VDPAU card, with any of its different deinterlacing algorithms, does that issue still exist?
Goga
On Mon, Feb 16, 2009 at 01:29:22PM +0000, Tony Houghton wrote:
On Mon, 16 Feb 2009 13:26:18 +1000 Torgeir Veimo torgeir@pobox.com wrote:
Andy Ritger (nvidia) said in a mail to the xorg mailing list some time ago:
"If the application doesn't enable de-interlacing, NVIDIA's VDPAU implementation will currently copy the weaved frame to the "progressive" surface, and whether it will come out correctly will depend whether the window's offset from the start of the screen is odd or even."
I take this to imply that field parity should be possible, but the application in use has to detect the field flag from the source material and set the Y offset to 0 or 1 accordingly.
It also relies on vsyncing correctly. If it syncs to the wrong field you'll get the backward juddering. I don't think VDPAU provides a way for applications to distinguish between odd and even vsyncs, but perhaps it does its own internal syncing.
It might be a good idea to ask Andy Ritger about vsyncing to the correct field.
Hopefully Nvidia is able to fix/provide that. VDPAU seems good otherwise.
-- Pasi
What is the input resolution, and does it change? What is your output resolution set to?
On 17 Feb 2009, at 18:11, Theunis Potgieter wrote:
What is the input resolution, and does it change? What is your output resolution set to?
Input is either DVB-T or DVB-C, and can be e.g. 540x576, 740x480, 720x576, or even 720p or 1080i sometimes. It just depends on what the broadcaster is sending. Output is set to either 720x576 or 1024x576.