2006-01-17, 07:02 PM
i would imagine this could still be a deinterlacing issue. it is safe to say that the signal sent to your TV is always interlaced by the video card and drivers, but what happens to the signal before that could be anything: the decoder may deinterlace, scale, sharpen, and adjust levels, and then the video card re-interlaces the result. any artifacts introduced by the decoder's deinterlacing will corrupt the video stream and show up in the final (re-interlaced) picture.
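to make the weave idea concrete, here's a minimal sketch of what i mean by weaving two fields into a progressive frame and splitting it back out (re-interlacing). this is just an illustration in numpy, my own construction, not what the ATI drivers or any decoder actually do:

import numpy as np

def weave(top_field, bottom_field):
    # interleave the scanlines of two fields into one progressive frame.
    # no motion compensation, so anything that moved between the two
    # fields shows up as combing artifacts.
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field     # even lines come from the top field
    frame[1::2] = bottom_field  # odd lines come from the bottom field
    return frame

def reinterlace(frame):
    # split a progressive frame back into two fields, roughly what the
    # video card has to do before sending the signal to an interlaced TV.
    return frame[0::2], frame[1::2]

# round trip on a made-up 8x6 frame: lossless *if* nothing is altered
# in between -- any scaling or sharpening of the weaved frame mixes
# lines from the two fields and corrupts the re-interlaced output.
top = np.arange(24, dtype=np.uint8).reshape(4, 6)
bottom = top + 100
t2, b2 = reinterlace(weave(top, bottom))
assert (t2 == top).all() and (b2 == bottom).all()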
if you disable deinterlacing (i.e. select weave) in the ATI CCC, this may get rid of the jittery video, but the picture may still not look good, according to the experiences of some of the other people in this thread. i don't know why that is, but i would imagine that the interlacer in the video card may not recognize that the video signal is already interlaced, and so it ends up interlacing an already-interlaced video (see how i interlaced various forms of "interlace" there...). imagine you are watching tv in a window on your desktop: i would think it would be tough for the software to recognize that part of the screen is already interlaced and to somehow sync that interlaced region with the progressive portion of the screen (the rest of your desktop) when interlacing the whole picture. see the toy sketch of this sync problem below.
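here's a toy illustration (again my own construction, nothing to do with the actual ATI driver) of why that sync is hard: the video window holds fields from two different moments woven into one progressive desktop frame, and if the card's re-interlacer pulls fields out with the wrong parity, the TV shows the later moment before the earlier one and motion jitters:

import numpy as np

def moving_bar_field(t, h=4, w=8):
    # a field captured at time t: a bright bar sitting at column t
    f = np.zeros((h, w), dtype=np.uint8)
    f[:, t % w] = 255
    return f

# the compositor weaves the already-interlaced video into the
# progressive desktop buffer
desktop = np.empty((8, 8), dtype=np.uint8)
desktop[0::2] = moving_bar_field(0)  # top field = time 0
desktop[1::2] = moving_bar_field(1)  # bottom field = time 1

# the card re-interlaces the whole desktop; the order the TV sees
# depends on which field parity the card happens to pick
correct = [desktop[0::2], desktop[1::2]]  # times 0, 1 -> smooth motion
flipped = [desktop[1::2], desktop[0::2]]  # times 1, 0 -> bar jumps backwards
for name, fields in (("correct", correct), ("flipped", flipped)):
    positions = [int(np.argmax(f[0])) for f in fields]
    print(name, "-> bar position per field:", positions)
# correct -> [0, 1]; flipped -> [1, 0], i.e. back-and-forth judder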