This Wikipedia article probably explains it better than I can:
http://en.wikipedia.org/wiki/Interlace
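The short version: an interlaced signal sends the odd-numbered scan lines in one field and the even-numbered lines in the next, and the set weaves the two back together. Here's a toy Python sketch of the idea, using a made-up four-line "frame":

    # Toy illustration of interlacing: a frame is split into two fields
    # (even lines and odd lines); an interlaced display shows them
    # alternately, while a progressive display draws every line each pass.
    frame = ["line0", "line1", "line2", "line3"]

    even_field = frame[0::2]   # lines 0, 2 -> sent in one pass
    odd_field  = frame[1::2]   # lines 1, 3 -> sent in the next pass

    # Weaving the two fields back together recovers the full frame.
    woven = [None] * len(frame)
    woven[0::2] = even_field
    woven[1::2] = odd_field
    assert woven == frame

    print("even field:", even_field)
    print("odd field: ", odd_field)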
An HDTV logo shouldn't be on the screen unless the picture is genuinely high definition, meaning 720p or 1080i at minimum. 1920 x 1080 is still 1080p, progressive scan; the screen ratio is 16:9, wider than it is tall.
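Quick arithmetic check, as a throwaway Python snippet, that 1920 x 1080 does reduce to 16:9:

    from math import gcd

    width, height = 1920, 1080
    g = gcd(width, height)                # 120
    print(f"{width // g}:{height // g}")  # -> 16:9
    print(width / height)                 # -> 1.777..., i.e. 16/9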
It's all like the old computer mantra, "garbage in, garbage out." You could have the highest resolution, but if you play, for instance, a VHS tape, it will not improve the picture. In fact, it will likely show up the flaws in the picture.
480p is EDTV, Enhanced Definition Television. TBS is one network that might slip that HDTV symbol in when showing, say, re-runs of "The X Files," but the resolution looks like 480p to me. The trouble is, those shows were not filmed with a hi-def analog camera, let alone a digital camera. The first plasmas were really 852 x 480 pixels, and those monitors are still made, mostly for commercial applications like sports bars, where viewers sit quite far from the screen. They are progressive scan, so they are actually 480p. A 1080i set is not progressive scan, but a regular DVD played on a progressive-scan player will deliver 480p.
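To keep the alphabet soup straight, here's a quick Python reference table of the common ATSC scan formats (pixel counts are the nominal ATSC values; DVDs themselves are stored at 720 x 480):

    # Nominal ATSC scan formats: (width, height, scan type).
    # EDTV is 480p; true HD starts at 720p / 1080i.
    formats = {
        "480i":  (704, 480,   "interlaced"),   # standard definition
        "480p":  (704, 480,   "progressive"),  # EDTV; progressive-scan DVD output
        "720p":  (1280, 720,  "progressive"),  # HD
        "1080i": (1920, 1080, "interlaced"),   # HD, half the lines per field
        "1080p": (1920, 1080, "progressive"),  # HD, full frame every refresh
    }

    for name, (w, h, scan) in formats.items():
        print(f"{name}: {w} x {h}, {scan}, {w * h:,} pixels per frame")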
It's going to take some time to get the best out of the 1080p displays. The main advantage now is the fineness of the pixel matrix on a big screen: you can sit closer to the screen than with 1080i. The interlacing itself causes some sawtooth edges on sharp lines when viewed very close. It's just that you're not really getting any more detail, even in a transferred movie, because it wasn't there in the first place. Notice on Discovery that when a show is digitally shot, you can see every follicle of hair and the pores on people's faces. It's not just make-up; you wouldn't see that on the hi-def DVD of, say, "The Wedding Crashers," because that fine a detail is not present in theatrical films.
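That "sit closer" advantage is just geometry. A common rule of thumb is that the eye resolves about one arcminute, so once a single pixel subtends less than that, the pixel grid disappears. A rough Python estimate under that assumption (the screen sizes are just examples):

    import math

    def min_viewing_distance(diagonal_in, width_px=1920, acuity_arcmin=1.0):
        """Distance (inches) at which one pixel subtends `acuity_arcmin`,
        i.e. roughly where the pixel structure stops being visible."""
        screen_w = diagonal_in * 16 / math.hypot(16, 9)  # width of a 16:9 panel
        pixel_pitch = screen_w / width_px                # inches per pixel
        theta = math.radians(acuity_arcmin / 60.0)       # arcminutes -> radians
        return pixel_pitch / math.tan(theta)

    for d in (42, 50, 60):
        print(f'{d}" 1080p: pixels blend at roughly {min_viewing_distance(d) / 12:.1f} ft')

For a 50-inch 1080p panel this works out to about 6.5 feet; on a 1080i or lower-resolution set you'd have to sit farther back before the line structure blends away.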
Yes.
Thank you for all of this wonderful information.
You are a treasure trove.
David