Sorry, not incorrect. If I set my receiver to output the native resolution of the signal, it gets sent to the TV that way and my TV then upconverts it to 1080i. If I instead set 1080i on the receiver, all signals are upconverted to 1080i before they are sent to the set, and of course they display at 1080i. If the upconversion in the receiver is as good as the upconversion in the TV, the two pictures naturally look the same: both are upconverted to 1080i, it's just that the upconversion is done in a different place. If you want even better upconversion you can buy a specialist video processor/scaler like a DVDO and put it between the receiver and the TV. You output from the receiver at the native resolution and the scaler then upconverts it to whichever resolution the TV handles best.
This whole issue of upconversion is why the only way to decide what resolution is best to output from your set-top box to the TV is to try all the options.
Upconverting to 1080i is NOT the same as creating an HD picture; I think you are confusing the two things. You can't create detail that wasn't there before. What the upconverter does is try to "fill in the gaps" based on what it sees, in order to produce a signal that is best matched to the display. You tend to get a picture that is much cleaner and crisper than the original, while obviously still not HD.
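If it helps to picture what "filling in the gaps" means, here is a rough sketch in Python. The numbers and the simple linear filter are just for illustration; real scalers like the DVDO use far more sophisticated filtering, but the principle is the same: every new pixel is only ever a blend of pixels that were already there.

```python
# Illustrative sketch only: upscaling by interpolation, not a real scaler.
import numpy as np

# Pretend this is one row of luminance samples from an SD frame.
sd_row = np.array([10.0, 50.0, 200.0, 120.0])

# "Upconvert" to roughly twice the horizontal resolution by interpolating
# new samples halfway between the existing ones.
sd_positions = np.arange(len(sd_row))                            # 0, 1, 2, 3
hd_positions = np.linspace(0, len(sd_row) - 1, 2 * len(sd_row) - 1)
hd_row = np.interp(hd_positions, sd_positions, sd_row)

print(hd_row)
# [ 10.  30.  50. 125. 200. 160. 120.]
# The row is smoother and better matched to an HD panel, but each new value
# is just an average of its neighbours -- no real detail has been created.
```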