NextPVR Forums

Full Version: video card problems
To start, I have read and re-read, and done everything in, http://forums.nextpvr.com/showthread.php?t=16157, and I'm still having problems with a fuzzy picture. It's not bad, but I can definitely tell the difference compared to my regular TV. It might be that my TV is making up for the signal with comb filtering; I'm not sure. I have a Sony 27" Trinitron.

I was running an FX5600 Ultra with S-Video, and because of that thread I decided to get a 6200. I put in the 6200 and have uninstalled and reinstalled the drivers as well as PureVideo (which I'm using as the decoder). I am still not seeing the 'VMR Default' deinterlacing option. All I see are 'Best Available', 'Combine', and something else. When I was using the 5600 Ultra I did see 'VMR Default', 'Pixel Adaptive', etc. Am I right in assuming that 'VMR Default' would be hardware, where 'Best Available' and 'Combine' would be software? Does this mean that PureVideo is not picking up on the fact that this 6200 has hardware deinterlacing? The only reason I bought this card was for the hardware deinterlacing advantage. If anyone can either correct me if I'm wrong, tell me how to fix it, or give me some other advice on this, I would appreciate it.

Now for the second thing: would I be better off returning the 6200 and getting a Radeon 9550 instead? I've read a lot of different posts; some talk up Nvidia, some talk up ATI. I know it is probably more personal preference, but there has to be some hard fact as to which works better for SDTV over S-Video. Some recommendations would be great, even if they are just your opinion.
How long is your S-Video cable? I've seen cases where the composite output was much sharper than S-Video if the cable was too long or not well shielded.
lseats Wrote:How long is your S-Video cable? I've seen cases where the composite output was much sharper than S-Video if the cable was too long or not well shielded.
I would say about 6 ft or so. I've used several different S-Video cables, some thicker than others (I'm not sure about the official shielding of such cables). I've tried composite as well; no difference. I'm willing to accept that my S-Video cable could be causing degradation, but I still want to solve the problem I'm having with PureVideo before I spend more money on a cable.
One other thing: how do you change the 'Decoder Format' in PureVideo? I've seen people with 'DirectX VA mode B' as well as C, and I've heard this can make the picture better on certain TVs.
dirtymafia Wrote:Am I right in assuming that 'VMR Default' would be hardware, where 'Best Available' and 'Combine' would be software?

You are correct. There is a trick to make this work.
1) Ensure GBPVR is configured to use the PureVideo decoder and that GBPVR is set to VMR9 (or VMR9 Custom).
2) Go into the PureVideo decoder settings, tick "Hardware Acceleration" and "Prefer VMR9", and click "OK" to apply the settings.
3) Run GBPVR Live TV or Video so that PureVideo runs at least once.
4) Go back into the PureVideo settings and VMR Default will be shown, BUT YOU MUST CLICK "OK" TO APPLY THE SETTING (even though it is already shown).
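
For the curious: at the DirectShow level, both choices in step 1 use the same Video Mixing Renderer 9 filter; a "custom" setup typically just switches the renderer out of its default windowed mode so the application controls presentation itself. Here is a minimal C++ sketch of the standard calls involved (an illustration of the concept, not GBPVR's actual code):

Code:
#include <dshow.h>
#include <d3d9.h>
#include <vmr9.h>
// link: strmiids.lib, ole32.lib

HRESULT AddVmr9(IGraphBuilder *graph, bool windowless, HWND hwnd)
{
    IBaseFilter *vmr9 = NULL;
    HRESULT hr = CoCreateInstance(CLSID_VideoMixingRenderer9, NULL,
                                  CLSCTX_INPROC_SERVER, IID_IBaseFilter,
                                  (void **)&vmr9);
    if (FAILED(hr)) return hr;

    hr = graph->AddFilter(vmr9, L"VMR9");
    if (SUCCEEDED(hr) && windowless) {
        // "Custom"-style setups switch the renderer out of its default
        // windowed mode so the application manages presentation itself.
        IVMRFilterConfig9 *cfg = NULL;
        if (SUCCEEDED(vmr9->QueryInterface(IID_IVMRFilterConfig9, (void **)&cfg))) {
            cfg->SetRenderingMode(VMR9Mode_Windowless);
            cfg->Release();
        }
        IVMRWindowlessControl9 *wc = NULL;
        if (SUCCEEDED(vmr9->QueryInterface(IID_IVMRWindowlessControl9, (void **)&wc))) {
            wc->SetVideoClippingWindow(hwnd);  // video draws into our window
            wc->Release();
        }
    }
    vmr9->Release();  // the filter graph keeps its own reference
    return hr;
}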

Quote:Now for the second thing: would I be better off returning the 6200 and getting a Radeon 9550 instead? I've read a lot of different posts; some talk up Nvidia, some talk up ATI. I know it is probably more personal preference, but there has to be some hard fact as to which works better for SDTV over S-Video. Some recommendations would be great, even if they are just your opinion.

Some GBPVR users are experiencing aspect ratio problems with Nvidia 6- and 7-series cards when DXVA (hardware acceleration) is enabled. This is why I'm cautious about recommending Nvidia cards; see the link below:
http://forums.nextpvr.com/showthread.php?t=16198

Nvidia cards are superior for PAL users because they have native support for a 50 Hz refresh rate on the component out and VGA/DVI out. ATI doesn't have this feature, and any other refresh rate causes mild stuttering on PAL interlaced content.
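
To see why 50 Hz matters: PAL interlaced video delivers 50 fields per second, so at 50 Hz each field gets exactly one refresh, while any other rate forces fields to be repeated unevenly. A tiny C++ illustration of the arithmetic (my numbers, not from any GBPVR tool):

Code:
#include <cstdio>

int main()
{
    const double pal_fields_per_sec = 50.0;   // PAL interlaced: 50 fields/s
    const int refresh_rates[] = { 50, 60, 75 };

    for (int hz : refresh_rates) {
        double per_field = hz / pal_fields_per_sec;  // refreshes per field
        bool even = (per_field == (double)(int)per_field);
        printf("%2d Hz: %.2f refreshes per field -> %s\n", hz, per_field,
               even ? "even cadence (smooth)" : "uneven cadence (stutter)");
    }
    return 0;
}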

I have done direct comparisons between an ASUS/ATI 9550 and a Legend/Nvidia 6600GT, and in general found the composite and S-Video output sharper on the Nvidia card.
dirtymafia Wrote:One other thing: how do you change the 'Decoder Format' in PureVideo? I've seen people with 'DirectX VA mode B' as well as C, and I've heard this can make the picture better on certain TVs.

You have no control over this; it is the DirectShow interface mode negotiated between the video decoder, the renderer, and the video card. Modes A and C are used for MPEG-2 playback and are the same, except that mode A places tighter restrictions on the parameters that are negotiated. Nvidia cards use mode A, and ATI cards use mode C. Mode B is for DVD playback.
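
For reference, those mode letters correspond to the DXVA 1.0 restricted-mode profile GUIDs declared in the Windows SDK header dxva.h; the C++ snippet below simply prints them (purely illustrative, since, as said above, the mode is negotiated internally and the user cannot choose it). The comments reflect my reading of the restrictions described above, so verify against the DXVA documentation:

Code:
#include <windows.h>
#include <initguid.h>   // instantiate the GUIDs declared in dxva.h
#include <dxva.h>
#include <cstdio>
// link: ole32.lib

static void show(const wchar_t *name, REFGUID guid)
{
    WCHAR buf[64];
    StringFromGUID2(guid, buf, 64);
    wprintf(L"%-18s %s\n", name, buf);
}

int main()
{
    // DXVA 1.0 restricted-mode profiles for MPEG-2:
    show(L"DXVA_ModeMPEG2_A", DXVA_ModeMPEG2_A);  // MPEG-2, tighter restrictions
    show(L"DXVA_ModeMPEG2_B", DXVA_ModeMPEG2_B);  // DVD playback
    show(L"DXVA_ModeMPEG2_C", DXVA_ModeMPEG2_C);  // MPEG-2, looser restrictions
    show(L"DXVA_ModeMPEG2_D", DXVA_ModeMPEG2_D);
    return 0;
}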
csy Wrote:You are correct. There is a trick to make this work.
1) Ensure GBPVR is configured to use the PureVideo decoder and that GBPVR is set to VMR9 (or VMR9 Custom).
2) Go into the PureVideo decoder settings, tick "Hardware Acceleration" and "Prefer VMR9", and click "OK" to apply the settings.
3) Run GBPVR Live TV or Video so that PureVideo runs at least once.
4) Go back into the PureVideo settings and VMR Default will be shown, BUT YOU MUST CLICK "OK" TO APPLY THE SETTING (even though it is already shown).
So if I am not seeing 'VMR Default' or 'Pixel Adaptive', then I am NOT outsourcing the deinterlacing to the video card, right?

If that is correct, why did 'VMR Default' and 'Pixel Adaptive' show up when I had my FX5600 installed?
dirtymafia Wrote:So if I am not seeing 'VMR Default' or 'Pixel Adaptive', then I am NOT outsourcing the deinterlacing to the video card, right?

You are correct.

Quote:If that is correct, why did 'VMR Default' and 'Pixel Adaptive' show up when I had my FX5600 installed?

VMR Default will use the highest quality deinterlacing available on the card (all VMR-capable video cards support BOB deinterlacing at minimum).
'Pixel Adaptive' is a generic VMR term that is advertised by the video card to the decoder during the VMR 'connect' negotiations, and it does not necessarily translate to genuine pixel-adaptive deinterlacing. Using my ATI 9550, for example, I have two 'Pixel Adaptive' entries: one does true pixel-adaptive deinterlacing, the other does weave (no deinterlacing). There are also different types of pixel-adaptive deinterlacing. The Nvidia PureVideo feature set offers 'Spatial-Temporal De-Interlacing', which is an advanced pixel-adaptive deinterlacing algorithm.
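
If you want to see which deinterlacing devices a card actually advertises, they can be enumerated programmatically. The C++ sketch below uses the newer DXVA2 API (Windows Vista and later), so treat it as an illustration of the concept; on the XP-era systems in this thread, the same information is exchanged during the DXVA1/VMR9 negotiation described above:

Code:
#include <windows.h>
#include <d3d9.h>
#include <dxva2api.h>
#include <cstdio>
// link: d3d9.lib, dxva2.lib, ole32.lib

int main()
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = GetDesktopWindow();

    IDirect3DDevice9 *dev = NULL;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            pp.hDeviceWindow,
            D3DCREATE_SOFTWARE_VERTEXPROCESSING | D3DCREATE_MULTITHREADED,
            &pp, &dev)))
        return 1;

    IDirectXVideoProcessorService *vp = NULL;
    if (FAILED(DXVA2CreateVideoService(dev,
            __uuidof(IDirectXVideoProcessorService), (void **)&vp)))
        return 1;

    // Describe an interlaced NTSC SD stream (720x480 NV12, ~29.97 fps).
    DXVA2_VideoDesc desc = {};
    desc.SampleWidth  = 720;
    desc.SampleHeight = 480;
    desc.SampleFormat.SampleFormat = DXVA2_SampleFieldInterleavedEvenFirst;
    desc.Format = (D3DFORMAT)MAKEFOURCC('N', 'V', '1', '2');
    desc.InputSampleFreq.Numerator   = 30000;
    desc.InputSampleFreq.Denominator = 1001;
    desc.OutputSampleFreq = desc.InputSampleFreq;

    UINT count = 0;
    GUID *guids = NULL;
    if (SUCCEEDED(vp->GetVideoProcessorDeviceGuids(&desc, &count, &guids))) {
        for (UINT i = 0; i < count; ++i) {
            DXVA2_VideoProcessorCaps caps = {};
            vp->GetVideoProcessorCaps(guids[i], &desc, D3DFMT_X8R8G8B8, &caps);
            WCHAR buf[64];
            StringFromGUID2(guids[i], buf, 64);
            // DeinterlaceTechnology is a bit mask: BOB, pixel-adaptive,
            // motion-vector-steered, etc.
            wprintf(L"%s  deinterlace tech mask: 0x%08X\n",
                    buf, caps.DeinterlaceTechnology);
        }
        CoTaskMemFree(guids);
    }

    vp->Release(); dev->Release(); d3d->Release();
    return 0;
}

Running something like this on a card with several devices shows a different technology mask per device, which is why two entries can both surface in a decoder's UI under the same 'Pixel Adaptive' label.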

Perhaps your FX5600 can do some form of pixel-adaptive deinterlacing, but how good is it?
Nvidia state that the PureVideo feature set is only available starting from the GeForce 6 series, and in my opinion, when you use the PureVideo deinterlacing, the picture quality matches a stand-alone TV (other GBPVR users have commented the same).
http://www.nvidia.com/page/purevideo_support.html
csy Wrote:You are correct. There is a trick to make this work.
1) Ensure GBPVR is configured to use the PureVideo decoder and that GBPVR is set to VMR9 (or VMR9 Custom).
2) Go into the PureVideo decoder settings, tick "Hardware Acceleration" and "Prefer VMR9", and click "OK" to apply the settings.
3) Run GBPVR Live TV or Video so that PureVideo runs at least once.
4) Go back into the PureVideo settings and VMR Default will be shown, BUT YOU MUST CLICK "OK" TO APPLY THE SETTING (even though it is already shown).

This sounds like wiki material if it's not already on there...
csy Wrote:You are correct. There is a trick to make this work.
1) Ensure GBPVR is configured to use the PureVideo decoder and that GBPVR is set to VMR9 (or VMR9 Custom).
2) Go into the PureVideo decoder settings, tick "Hardware Acceleration" and "Prefer VMR9", and click "OK" to apply the settings.
3) Run GBPVR Live TV or Video so that PureVideo runs at least once.
4) Go back into the PureVideo settings and VMR Default will be shown, BUT YOU MUST CLICK "OK" TO APPLY THE SETTING (even though it is already shown).

I have read this and the other lengthy thread with lots of advice from csy - and thank you for being so generous with your expertise! I'm afraid I'm still having trouble and would like to pester you some more.

I have version 97.13 of GBPVR installed, and an Nvidia 6200 video card feeding a TV set over S-Video.
My old video card, a GeForce2, was giving such a fuzzy picture that I bought the 6200 and installed it. In the video card driver settings I turned off anti-flicker and set 720x480 at 60 Hz, with the TV as the primary display. I'm using the latest drivers from the Nvidia web site.

Now, I can get a slightly improved, but still fuzzy, picture with the new card using the InterVideo decoder that came with my Hauppauge 150, so I downloaded the Nvidia PureVideo 1.02.223 software. I have a setup that allows me to switch back and forth between the PC and a VCR that I use as a tuner, for comparison of the picture quality.

I attempted to follow the instructions above, leaving everything at the default settings and changing only the ones mentioned.
I used VMR9 Custom in GBPVR, since I read in the forum that it gives better results with Nvidia cards.
I set the PureVideo screen up as follows:
1) Hardware acceleration was already checked.
2) De-interlace control had 4 options: automatic, film, video, and smart. I picked automatic.
3) De-interlace mode had 3 options: Best Available, Display Fields Separately, or Combine Fields. I picked Best Available.
4) Enhanced nView support had 3 settings: Prefer Overlay, Prefer VMR 7, and Prefer VMR 9. I picked VMR 9.
5) Display type had 4 options: Content default, Letterbox, Pan and scan, and Anamorphic/Raw aspect. I chose Anamorphic/Raw.
and then OK.

After I ran GBPVR with a live broadcast and went back to the PureVideo settings page, there was nothing that said VMR Default, but the radio button for VMR 7 was now checked. I re-checked VMR 9 and went back to live TV; this time the VMR 9 button was still selected when I returned to PureVideo.
In every case, the picture on the TV is "jittery" or "ghosting" horizontally. From what I read here, that is a deinterlacing problem.
After experimenting with the various controls, "Combine Fields" works best, but the picture is still jittery. The best mode I can get to work with PureVideo is "Overlay Manager", which gives me a picture that is just as fuzzy as with the InterVideo software. It also gives me a 30-second delay between clicking "Live TV" on the GBPVR menu and the time a picture actually appears.

Before installing PureVideo, I tried the registry changes you explained in the other thread for the InterVideo decoder. Like ubu, I found the settings in the Hauppauge keys under HKLM. I did not find a corresponding entry under HKCU as you suggested, and changing the VideoDec value to 1 in HKLM made no change in the picture.

Please, any help would be welcome.

Walt