This post is so full of ignorance, I'm not sure where to start.
H.264 has been part of the ATSC standards since 2008. Broadcasters could use H.264 over ATSC if they wanted to, but they'd rather accommodate stubborn people who don't want to update their equipment to handle a codec that isn't 16 years old, so we all must suffer with MPEG-2 because of people with MPEG-2-only equipment dragging us down.
Remember that digital conversion thing that took >10 years? Where Congress had to push the deadline multiple times and subsidize tuners so that NTSC could end? You expect to repeat that process barely three years later and make OTA viewers buy another new TV? Just because ATSC added AVC in A/72 doesn't mean anyone supports it. A/53 also has a 16VSB mode that nobody uses. The FCC incorporated A/53 only (and not even the newest version).
Again. It's not required. They could leave all the viewers with equipment that can only handle MPEG-2 in the dark if they wanted to. They would rather cater to the lowest common denominator, however.
Certainly no broadcaster is going to willingly lose viewers by switching to AVC. The FCC mandate for DTV is to replicate NTSC service, so there would still have to be an SD MPEG-2 program in the mux anyway. Your dream of 18 Mbit/s AVC will never happen. Wait for ATSC 2.0 or 3.0.
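For a rough sense of the arithmetic (a sketch with assumed numbers for the SD simulcast and overhead, not figures from any actual station):

```python
# Back-of-the-envelope: why "18 Mbit/s of AVC" doesn't fit in an ATSC 1.0 mux
# that still has to carry an SD MPEG-2 program. The ~19.39 Mbit/s payload is
# the standard 8VSB figure; the other numbers are illustrative assumptions.
ATSC_PAYLOAD_MBPS = 19.39      # usable transport-stream payload of one 8VSB channel
sd_mpeg2_simulcast = 3.5       # assumed bitrate for a watchable SD MPEG-2 program
psip_audio_overhead = 1.0      # assumed PSIP tables, audio, null packets, etc.

left_for_avc = ATSC_PAYLOAD_MBPS - sd_mpeg2_simulcast - psip_audio_overhead
print(f"Left for an AVC HD stream: ~{left_for_avc:.1f} Mbit/s")  # ~14.9, not 18
```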
720p does not have more pixels/second than 1080i. You pulled that one out of thin air.
720p:  1280 x 720 x 60000/1001 = 55,240,759 pixels/sec
1080i: 1920 x 540 x 60000/1001 = 62,145,854 pixels/sec (540-line fields)
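If you want to check that arithmetic yourself, here's a trivial sketch of the same calculation (nothing in it beyond the numbers above):

```python
# Active pixels delivered per second for the two ATSC HD formats at 59.94 Hz.
# 720p sends full 1280x720 frames; 1080i sends 1920x540 fields, two per frame.
RATE = 60000 / 1001                # 59.94... frames or fields per second

p720 = 1280 * 720 * RATE           # progressive frames
i1080 = 1920 * 540 * RATE          # interlaced fields

print(f"720p : {p720:,.0f} pixels/sec")   # ~55,240,759
print(f"1080i: {i1080:,.0f} pixels/sec")  # ~62,145,854
```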
720p has more temporal resolution. 1080i has more spatial resolution. Fox and Disney/ABC/ESPN chose 720p because of sports. By the time you account for the Kell factor (which I was accounting for) and the coding inefficiency of interlaced MPEG-2 relative to progressive, 1080i doesn't always have the obvious advantage you seem to believe it does. It's highly unlikely your TV has a better deinterlacer than your local station's. Even if it did, it would be downstream of their horrible, antique relic of an MPEG-2 encoder. For a telecined source (film), 1080i is certainly superior.
Most PBS affiliates run at 11 Mbps. It's pathetic. That disturbingly low bitrate will introduce enough artifacts of its own that it diminishes any quality that might be gained from higher-quality source material.
Given a fixed output bitrate, a better input will yield better output. It's irrelevant that the output is 11 Mbit/s (pathetic or not). 11 Mbit/s CBR would be crap, but it's not necessarily a problem if that's just the long-term average within a statmux.
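To picture what that last point means, here's a toy sketch of statistical multiplexing (the program names and complexity numbers are made up for illustration, not how any real encoder behaves):

```python
# Toy statmux allocation: the total transport payload is fixed, but each
# program's share varies from moment to moment with scene complexity.
TOTAL_MBPS = 19.39  # fixed ATSC payload shared by every program in the mux

def statmux_allocate(complexities):
    """Split the fixed payload in proportion to each program's current complexity."""
    total = sum(complexities.values())
    return {name: TOTAL_MBPS * c / total for name, c in complexities.items()}

# One instant: the HD program has a busy sports scene, the subchannels are static.
snapshot = {"HD main": 8.0, "SD sub1": 1.5, "SD sub2": 1.0}
for name, mbps in statmux_allocate(snapshot).items():
    print(f"{name}: {mbps:.1f} Mbit/s right now")

# The HD program might average 11 Mbit/s over an hour yet peak far higher during
# demanding scenes, which is very different from being pinned at 11 Mbit/s CBR.
```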
Congratulations on being the 1% (of TV snobs).