I think cal87 asks a reasonable question - enquiring minds want to know!
The 19.4 Mbps maximum bitrate in a 6 MHz channel allocation is something the content producers can manage to their advantage. While a true HD signal can consume that whole capacity, there are plenty of lower-resolution options available to the content providers that can be further managed by the carriers. High-action HD content needs the full channel capacity to look its best; other content can look nearly as good at lower rates while leaving room for additional programming in the same allocation. The point is, the guy at the switch will always be tempted to trade quality for quantity, regardless of the compression techniques employed, whenever it's more economically beneficial for him to do so.
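To make the quality-vs-quantity tradeoff concrete, here's a minimal sketch of how a broadcaster might budget the ~19.4 Mbps payload across subchannels. All the numbers (the overhead allowance, the per-feed bitrates, the subchannel names) are illustrative assumptions, not real station configurations.

```python
# Hypothetical multiplex budget for an ATSC 6 MHz channel.
# Bitrate figures below are illustrative assumptions only.

TOTAL_MBPS = 19.39      # approximate ATSC 8-VSB payload in a 6 MHz channel
OVERHEAD_MBPS = 0.5     # assumed allowance for PSIP tables, audio, etc.

def plan(subchannels):
    """subchannels: list of (name, mbps) requests.
    Returns remaining headroom in Mbps, or raises if over capacity."""
    budget = TOTAL_MBPS - OVERHEAD_MBPS
    used = sum(mbps for _, mbps in subchannels)
    if used > budget:
        raise ValueError(f"over capacity by {used - budget:.2f} Mbps")
    return budget - used

# One full-quality HD feed consumes nearly the whole channel...
headroom_hd = plan([("HD main", 17.5)])

# ...while dialing the HD feed down makes room for extra SD subchannels.
headroom_mux = plan([("HD main", 12.0), ("SD-2", 3.5), ("SD-3", 3.0)])

print(f"HD-only headroom:  {headroom_hd:.2f} Mbps")
print(f"Mixed-mux headroom: {headroom_mux:.2f} Mbps")
```

The second plan is exactly the "guy at the switch" scenario: the same allocation now carries three revenue-generating streams, at the cost of starving the HD feed of bits during high-action scenes.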
The current technology, number of available transponders, etc. are only part of the equation. Market demand may have a larger impact on how the available technology is utilized. If consumers demand more HD content at higher quality and are willing to pay the $$ to support it, then we might see a migration toward higher-quality content. Otherwise the supply side will likely favor more channels at lower quality, since that provides more commercial revenue and the largest ROI. Where it all balances out is the real question....
It's like I feel when I'm driving my car - I don't have to know what's happening under the hood to get where I'm headed, and most of the time I don't care. But the gear-head in me occasionally wants to know how that exhaust-induced dose of steroids known as turbocharging is managed most efficiently to get me the performance I crave...