2006-11-04, 04:25 PM
I had been running my graphics output at 1024x768 when using VGA output to my 42" plasma monitor (HD/1080i-capable) -- with GBPVR, of course -- but when I installed an HD tuner (Cat's Eye 150) a couple of days ago, I decided to upgrade my NVidia drivers (and also to change to the DVI/HDMI output on the NVidia card). The NVidia card/drivers detected my plasma and automatically changed my resolution to "optimized for HDTV" (1920x1080). Now everything played through GBPVR plays in a square box in the middle of my plasma panel (vs. a rectangle filling the whole panel).
I guess my general question is -- what resolution should I be running my graphics card at to watch full-screen television (both analog with my PVR-150 and OTA ATSC with my Cat's Eye 150)?
I've tried switching to overlay (vs. VMR9), but still get the square in the middle.
Thanks!
Kevin T.
===============================
Specs:
Celeron 3GHz, 512MB DDR2, two 160GB hard drives
NVidia GeForce 6200 (128MB/64-bit, vga/svideo/dvi out), NVidia PureVideo Decoder
Hauppauge PVR-150
VBox Cat's Eye 150 (DTA-150)
GBPVR
Akai 42" HD-capable (1080i) plasma tv (no internal tuners) -- connected via DVI->HDMI cable to my GeForce card