Let me clear up a few things before people get confused about this whole i vs. p thing. The concept of interlacing came about during the CRT era. It is the process whereby the even lines and the odd lines are drawn on the screen in alternating passes, which allowed broadcasters to use half the bandwidth. In the era of digital transmission, specifically ATSC, there are two standards for HD content: 720p (1280x720) and 1080i (1920x1080). Every HDTV will be able to receive both of these formats, but only CRTs can display 1080i content natively. All digital pixel-based displays are progressive scan and cannot show 1080i natively. So you need to look at the native pixel count of the TV: some TVs are 1280x720, several cheaper plasmas are 1024x768, many LCDs are 1366x768, and the higher-end LCDs as well as several DLP and LCoS sets are 1920x1080.
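If it helps, here is a toy Python sketch of the even/odd idea (nothing a TV actually runs; the 1080-line frame is just an example) showing why each field carries only half the lines:

    def split_into_fields(frame):
        """frame is a list of scan lines, top to bottom."""
        even_field = frame[0::2]  # lines 0, 2, 4, ... drawn on one pass
        odd_field = frame[1::2]   # lines 1, 3, 5, ... drawn on the next pass
        return even_field, odd_field

    frame = [f"line {n}" for n in range(1080)]   # a 1080-line progressive frame
    even_field, odd_field = split_into_fields(frame)
    print(len(even_field), len(odd_field))       # 540 540 -- half the lines each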
These pixel-based displays can accept multiple resolutions at the input side, such as 480i, 480p, 720p, 1080i and even 1080p, but they have to convert everything to the native pixel grid through a process of deinterlacing and scaling. Some TVs have good scalers, some don't. That is why a TV with a mediocre scaler might show content received in its native format well but make anything else look bad.
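As a rough illustration of that deinterlace-and-scale step, here is a toy sketch assuming the simplest possible methods (a "weave" deinterlace and nearest-neighbor scaling of the line count only) and a 1366x768 panel; real video processors are far more sophisticated:

    def weave_deinterlace(even_field, odd_field):
        # Interleave the two fields back into one full frame ("weave"
        # deinterlacing; fine for static images, real chips do much more).
        frame = []
        for even_line, odd_line in zip(even_field, odd_field):
            frame.append(even_line)
            frame.append(odd_line)
        return frame

    def scale_lines(frame, native_lines):
        # Nearest-neighbor scaling of the line count, just to keep it short;
        # real scalers filter in both directions.
        src = len(frame)
        return [frame[int(i * src / native_lines)] for i in range(native_lines)]

    # A 1080i signal arrives as two 540-line fields.
    even_field = [f"even line {n}" for n in range(540)]
    odd_field = [f"odd line {n}" for n in range(540)]

    full_frame = weave_deinterlace(even_field, odd_field)   # 1080 lines again
    panel_frame = scale_lines(full_frame, 768)               # fit a 1366x768 panel
    print(len(full_frame), len(panel_frame))                 # 1080 768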
So, if a TV is advertised as HDTV and it says 720p native, don't fret. It will accept 1080i, 720p, 480p and most probably 480i (maybe not through HDMI, but through component, etc.). In fact, I bought an inexpensive 37" LCD that is 1366x768 native, and it even accepts 1080p; it just downconverts to the native pixel count. I confirmed this by connecting my PS3 to it and setting the PS3's output resolution to 1080p.
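For the curious, the downconversion arithmetic for a 1366x768 panel works out roughly like this (the 480p dimensions below are the common DVD-style 720x480; broadcast SD can differ slightly):

    native_w, native_h = 1366, 768

    inputs = {"480p": (720, 480), "720p": (1280, 720),
              "1080i/1080p": (1920, 1080)}

    for name, (w, h) in inputs.items():
        # Ratio of source pixels to panel pixels along each axis:
        # above 1 means downscaling, below 1 means upscaling.
        print(f"{name}: {w / native_w:.2f}x horizontal, {h / native_h:.2f}x vertical")

So a 1080p source gets squeezed by roughly 1.4x in each direction on that panel, while 720p is close to a 1:1 fit.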
I hope this clears up a few things.