2009-04-11, 07:49 PM
What bitrate & resolution do you guys record at?
I had mine set @ 13Mbps constant (constant bitrate mainly because it made skipping easier with my PCH)
I have my cable box set to 720p (even though my set is 1080p) because I thought it would give me a better bitrate/pixel ratio. Does that matter at all?
Would a given bitrate be less prone to smearing & artifacts @ 720p than at 1080i? I could probably set the bitrate lower since Comcast compresses the crap out of everything anyway...
What say you guys? My HD set is 40" 1080p Samsung
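For anyone curious about the bitrate/pixel ratio angle, here's a rough back-of-envelope sketch (my own arithmetic, not anything the cable box reports). At a fixed 13 Mbps, 720p60 actually gets slightly more bits per delivered pixel than 1080i, since 1080i's 60 fields/s work out to 30 full frames/s:

```python
# Rough bits-per-pixel comparison at a fixed recording bitrate.
BITRATE = 13_000_000  # 13 Mbps constant, as in the post

# Pixels per second actually delivered by each broadcast format.
formats = {
    "720p60":  1280 * 720 * 60,    # progressive: 60 full frames/s
    "1080i30": 1920 * 1080 * 30,   # interlaced: 60 fields/s = 30 full frames/s
}

for name, pps in formats.items():
    print(f"{name}: {BITRATE / pps:.3f} bits/pixel")
# → 720p60: 0.235 bits/pixel
# → 1080i30: 0.209 bits/pixel
```

So the difference exists but it's small, and in practice the codec's efficiency on interlaced vs. progressive content probably matters more than the raw ratio.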
GBPVR v1.4.7
Windows 7 Ultimate (64-bit)
Intel Core 2 Duo 2.33 GHz
4 GB RAM, 160GB system drive
640GB recording drive
PVR-500 - analog cable stations
HDHomeRun - ATSC via antenna
nVidia GeForce 8600GT
1 PCH @ 1080p component (was NTSC via composite)
1 PCH @ 1080p HDMI (was component)