2006-06-09, 01:00 AM
You can match the picture quality of a stand-alone TV by using the pixel-adaptive hardware deinterlacing available on mid-range and higher video cards.
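If you want to check what deinterlacing hardware your card actually exposes, here is a rough standalone C++ sketch (not GBPVR code) that enumerates the DXVA2 video processor devices the driver reports. The 720x576 PAL source description and NV12 input format are assumptions, so adjust them for your own setup:

[CODE]
// Sketch: list the DXVA2 deinterlacing (video processor) devices the GPU driver
// exposes, as a quick check that hardware deinterlacing is available.
// Assumes Windows with the DirectX 9 runtime; error handling kept minimal.
// Build (MSVC): cl query_dxva2.cpp d3d9.lib dxva2.lib ole32.lib
#include <windows.h>
#include <d3d9.h>
#include <dxva2api.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { printf("Direct3D 9 not available\n"); return 1; }

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;
    pp.hDeviceWindow    = GetDesktopWindow();

    IDirect3DDevice9* dev = nullptr;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                 pp.hDeviceWindow,
                                 D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                 &pp, &dev)))
    { printf("CreateDevice failed\n"); return 1; }

    IDirectXVideoProcessorService* vps = nullptr;
    if (FAILED(DXVA2CreateVideoService(dev,
                   __uuidof(IDirectXVideoProcessorService), (void**)&vps)))
    { printf("No DXVA2 video processor service\n"); return 1; }

    // Describe a typical interlaced SDTV source (720x576 PAL, NV12) -- assumed values.
    DXVA2_VideoDesc desc = {};
    desc.SampleWidth  = 720;
    desc.SampleHeight = 576;
    desc.SampleFormat.SampleFormat = DXVA2_SampleFieldInterleavedEvenFirst;
    desc.Format = (D3DFORMAT)MAKEFOURCC('N','V','1','2');
    desc.InputSampleFreq.Numerator = 25; desc.InputSampleFreq.Denominator = 1;
    desc.OutputFrameRate.Numerator = 50; desc.OutputFrameRate.Denominator = 1;

    UINT  count = 0;
    GUID* guids = nullptr;
    if (SUCCEEDED(vps->GetVideoProcessorDeviceGuids(&desc, &count, &guids)))
    {
        printf("%u DXVA2 deinterlacer(s) reported by the driver\n", count);
        for (UINT i = 0; i < count; ++i)
        {
            DXVA2_VideoProcessorCaps caps = {};
            if (SUCCEEDED(vps->GetVideoProcessorCaps(guids[i], &desc,
                                                     D3DFMT_X8R8G8B8, &caps)))
                printf("  device %u: %u reference sample(s)\n", i,
                       caps.NumForwardRefSamples + caps.NumBackwardRefSamples);
        }
        CoTaskMemFree(guids);
    }

    vps->Release(); dev->Release(); d3d->Release();
    return 0;
}
[/CODE]

A device that needs zero reference samples is basically bob/weave; a non-zero forward/backward reference count is a reasonable hint that the driver offers an adaptive or motion-compensated mode.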
The S-video interface does degrade the picture, but on an SDTV only by a small amount. Try it for yourself: connect your VCR to the TV via S-video (or composite) and use the VCR's tuner as a back-to-back comparison against the TV's tuner.
[SIZE="1"]AMD Athlon X2 4200+ CPU, Gigabyte GA-MA770-DS3 mobo, 2GB RAM, 1TB SATA HDD, DigitalNow Dual Hybrid PCIE S2 and Hauppauge HVR2200 capture, ATI HD4670 video with HDMI-HDMI to 32" LCD TV at 1360x768, Win7 Home Premium 64bit, GBPVR 1.4.7, EVR renderer[/SIZE]