2008-05-16, 07:55 PM
I have a question about deinterlacing an HD signal.
I am getting a 1080i signal from my HDHomerun. I am using the NVIDIA PureVideo decoder with deinterlacing set to automatic (GBPVR set to encoder pass-through). My video card is an NVIDIA GeForce 6600, set to output a 1080i signal over component cables to my HDTV, a Toshiba CRT whose component inputs are natively 1080i.
My question is this: will the PureVideo decoder deinterlace the 1080i signal only to have it re-interlaced by the video card, or is the decoder smart enough to know not to deinterlace 1080i signals? Or am I thinking about it all wrong?
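To make my mental model concrete, here is roughly the decision I am hoping the "automatic" setting makes. This is just a hypothetical Python sketch; the function name and parameters are mine, not PureVideo's actual API or its real driver-internal logic:

def plan_deinterlace(source_interlaced, display_interlaced, mode="automatic"):
    # Hypothetical decision: should the decoder deinterlace before
    # handing frames to the video card for scan-out?
    if mode == "off":
        return False
    if mode == "force":
        return True
    # "automatic": deinterlacing only makes sense when the display
    # path is progressive. If the card re-interlaces for a 1080i
    # output anyway, deinterlacing first is wasted work and could
    # cost vertical detail, which is exactly my worry above.
    return source_interlaced and not display_interlaced

# My setup as described: 1080i source, 1080i component output.
print(plan_deinterlace(source_interlaced=True, display_interlaced=True))  # -> False

If "automatic" really behaves like that, a 1080i-in/1080i-out chain would just pass through. If it always deinterlaces regardless of the output mode, then the card would have to re-interlace afterward, which is the scenario I am asking about.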
Bill
--------
GIGABYTE GA-EP45-UD3P
Core 2 Duo E7400 2.80GHz
Asus GeForce 9800GT Hybridpower/HTDI/512M
1x 250 GB and 1x 1.5 TB Seagate SATA I drives
4 Gbyte RAM
Hauppauge WinTV PVR USB2
HDHomerun
Windows 7 Enterprise 64bit
NPVR 2.0.3