2005-11-19, 01:30 AM
Pardon the question from a video newbie but...
I have GBPVR up and running on a P3 833 MHz. It seems to work fine, and I have an Nvidia MX4000 with S-Video output.
I noticed in the config that the choice of decoder makes no difference to the TV out. Is that because it is being decoded by the video card?
I don't have any need to have the video displayed on the computer display, and I was wondering if it takes extra processor time to decode for the display as well as for the TV. In other words, could I lower my CPU utilization by not displaying the video on the computer display, perhaps by uninstalling the codecs? Or is there another way?
I would appreciate any light anyone can shed on this.