2008-03-28, 09:55 PM
I sometimes don't have my monitor powered on before the computer boots, so the card autodetects the monitor as missing and then proceeds to screw up the resolution. Is there a way to force a resolution to stay put? I have an older machine with an Nvidia card, and this appears to be the default behavior for Nvidia but not for ATI.
[SIZE="1"]Server rebuild: GIGABYTE GA-MA78GM-S2H, CPU AMD X2 4850e 2.5GHz 45W, 2GB RAM, 500GB HD, Hauppauge HVR-1600, Vista Ultimate SP1, IN WIN BK623 Mini case
Client: ASUS AMD M3A78-EMH HDMI motherboard, CPU AMD64 X2 3800+, 2GB RAM, 200GB HD, ATI HD3200 integrated graphics, WIN XP SP2 Pro, MCE303 case, 2x16 VFD, Irtrans MCE remote[/SIZE]
[SIZE="1"]How to Build your own GB-PVR HTPC computer[/SIZE]
[SIZE="1"]GB-PVR in action on YouTube[/SIZE]