You also mention Alvy Ray Smith. He conveniently leaves out key factors in his argument for 720P. He also fails to address the fact that interlaced scanning is designed to display on screen technology built for interlace, while his examples of flicker are taken from interlaced material shown on a progressive scan device. Instead he states categorically that progressive scan is superior in every way, when the truth is that it is not. He loves to compare apples to oranges, such as comparing the maximum specs of 720P to what is actually broadcast (practice) in 1080i. Hidden in his own words, he covers his butt by including a chart whose numbers, compared spec to spec, show that his whole argument is false: the 1080i pixel count is only less than the 720P pixel count if he thinks that 62M < 55M. As I said, he covers his butt, but he uses an apples-to-oranges comparison to make his point.
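To put the 62M vs 55M figures in concrete terms, here is the raw pixel-rate arithmetic at full ATSC spec (my own back-of-the-envelope numbers, assuming 30 full interlaced frames per second for 1080i and 60 progressive frames per second for 720P):

  1080i: 1920 x 1080 pixels/frame x 30 frames/sec = 62,208,000, roughly 62M pixels/sec
  720P:  1280 x 720 pixels/frame x 60 frames/sec = 55,296,000, roughly 55M pixels/sec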
Fact is- 720P looks better, with less flicker, on a monitor designed for progressive scan, while on a monitor designed for interlaced scan the interlaced signal will look better, assuming both monitors are capable of displaying all the resolution present in the signal.
It is true that the broadcast practice is to limit the horizontal pixel count to 1440 rather than the spec of 1920, due to the popular use of HDCAM equipment. This does not change the ATSC spec; it just means the practice has room to grow as the equipment gets better in the future.
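For a rough sense of that headroom (my arithmetic, not anything from the ATSC documents):

  1440 / 1920 = 0.75, so HDCAM-based practice carries 75% of the horizontal samples the spec allows.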
FWIW- I prefer 720P native output from my equipment here because it is what my native display is designed for: a digital 1280x720 display. However, my backup display is an analog CRT FPTV, and that is best used with a 1080i/30 signal.
The real gem is 1080p at 30fps, or even better 1080p at 60fps, but unfortunately those exceed the broadcast bandwidth and will only be realized in the future with closed systems such as HD-DVD.
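The arithmetic on why 1080p at 60fps cannot fit (raw pixel rates, before any compression, my own figures):

  1080p/60: 1920 x 1080 x 60 = 124,416,000, roughly 124M pixels/sec, about double the raw rate of either 1080i/30 or 720P/60.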
My main disagreement with Smith is that he tries to justify name-calling a technology by insisting that one display format should apply in all cases. I feel he is wrong, and that matching the proper signal to the display technology maximizes the image and eliminates the very artifacts he is complaining about. Much of his argument makes about as much sense as the old argument that dpi on a printer equals LPI equals pixels, which is only true if you leave out certain facts, or in math, certain parts of the equation.
Bottom line- If you use a progressive digital display, use 720P.
If you use an analog phosphor screen, you can use either 720P or 1080i, but 1080i may give you higher detail depending on the screen's ability to resolve the difference between 1280 and 1440 pixels. To my knowledge there are no broadcasts yet claiming 1920 pixels, except those sourced from full HDTV on D5 and not bandwidth-restricted below the full allotted 19.4 Mbps. The system allows for this, but it is not being practiced on any wide scale.