...if there really is no difference
There could be differences (see *NOTE* below).
...why do they make both kinds
Capabilities of the silicon used: 1080i silicon is old and cheap; 1080p silicon is new and expensive.
...how can they get away with charging so much more
For the same reason Porsche/Ferrari/Bentley/etc. are still in business.
...and why do so many supposedly knowledgeable people on this forum go for the more expensive 1080p models?
Because they have the money? Have golden eyes? Want bragging rights?
*NOTE*
Let's consider a motion picture originally made on film (not anime, not video). It is shot at 24 frames per second (24fps). Those frames are scanned and
processed. Every studio does this processing its own way, and it is not that important (because they will never tell you what they do). What is important is that a compressed 4:2:2/10-bit AVI file arrives at the encoding facility (Warner and Sony do encoding in-house; the rest of the studios outsource). At this point decisions are made about which audio/video codecs to use, what bitrate is available (audio and video are dealt with separately) and the rest (PiP, bonus material, games, etc.)
The movie is converted to 4:2:0/8-bit and encoded using the chosen codecs as 1080/24p, i.e. each frame is encoded as progressive (non-interlaced) and there are 24 of those per second.
The process described above is a bit simplified, but it is exactly the same for both HD and BD.
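(For the curious, here is a minimal numpy sketch, entirely my own illustration and not any studio's actual tool, of what that 4:2:2/10-bit to 4:2:0/8-bit step amounts to: the chroma planes lose half their vertical resolution and every sample loses two bits of precision. Real converters use proper filtering and dithering; the naive averaging and bit-dropping below are just to show the idea.)

```python
import numpy as np

def convert_422_10bit_to_420_8bit(y, cb, cr):
    """Toy illustration of 4:2:2/10-bit -> 4:2:0/8-bit.
    Real encoders filter and dither; this just averages and truncates."""
    def downsample_vertically(plane):
        # average each pair of chroma rows -> half the vertical chroma resolution
        return (plane[0::2, :].astype(np.uint32) + plane[1::2, :]) // 2

    def to_8bit(plane):
        # 10-bit samples (0..1023) -> 8-bit (0..255) by dropping the two LSBs
        return (plane >> 2).astype(np.uint8)

    return (to_8bit(y),
            to_8bit(downsample_vertically(cb)),
            to_8bit(downsample_vertically(cr)))

# A 1080-line frame: luma at full resolution, 4:2:2 chroma at half width
y  = np.random.randint(0, 1024, (1080, 1920), dtype=np.uint16)
cb = np.random.randint(0, 1024, (1080,  960), dtype=np.uint16)
cr = np.random.randint(0, 1024, (1080,  960), dtype=np.uint16)

y8, cb8, cr8 = convert_422_10bit_to_420_8bit(y, cb, cr)
print(cb8.shape)   # (540, 960): a quarter of the luma samples -> 4:2:0
```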
The differences start when you look at how those bits are stored on the disc (leaving aside AACS, BD+, etc. for the moment).
Both formats store the encoded bits as progressive frames, but HD has them flagged, i.e. carries instructions on how to deal with the stream when it is played by a 1080i player. No other differences. This is why Warner just uses a utility to convert their HD encodes into BD encodes (attaching the flags).
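(To make the "flagging" idea concrete, here is a toy sketch of what attaching flags, rather than re-encoding, could look like. The structure and field names are my own inventions in the spirit of MPEG-style repeat_first_field/top_field_first pulldown flags, not the actual HD DVD spec, and the real cadence bookkeeping is fussier than this.)

```python
from dataclasses import dataclass

@dataclass
class StoredFrame:
    """A progressively encoded 1080/24p frame plus the kind of pulldown
    flags the post describes (names are mine, not from any spec)."""
    picture_data: bytes               # the encoded progressive frame, untouched
    top_field_first: bool = True      # which field a 1080i player should send first
    repeat_first_field: bool = False  # repeat a field to build up the 2:3 cadence

def flag_for_60i(encoded_frames):
    """Attach alternating 2:3-cadence flags without re-encoding anything.
    A 1080/24p-capable player simply ignores these flags."""
    flagged = []
    for i, data in enumerate(encoded_frames):
        flagged.append(StoredFrame(
            picture_data=data,
            top_field_first=(i % 2 == 0),
            repeat_first_field=(i % 2 == 1),  # every other frame yields 3 fields
        ))
    return flagged

# Usage: four 24p frames come out as 2+3+2+3 = 10 fields' worth of instructions
print(flag_for_60i([b"f0", b"f1", b"f2", b"f3"]))
```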
If both your player and your display can handle 1080/24p, everything is plain and simple (and identical in both formats, HD/BD). The flagging in HD gets ignored; the bits are pulled from the disc and (hopefully) displayed unmolested on your TV/projector of choice.
When the player and/or the TV cannot handle 1080p, each progressive frame is split into two interlaced fields, with some fields repeated so that 24 frames per second become 60 fields per second (the process is called telecine, 1080/24p -> 1080/60i), and the fields are transmitted separately. Before being displayed, those fields are assembled back into progressive frames (the process is called IVTC, inverse telecine).
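(Here is a toy sketch of that telecine/IVTC round trip, just to show that, done correctly, it is lossless. A frame is modeled as a string whose characters stand for lines; the real 2:3 cadence is pickier about field order and parity than this, so treat it as an illustration only.)

```python
def split_fields(frame):
    """Top field = even-numbered 'lines', bottom field = odd-numbered 'lines'."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two fields back into a progressive frame."""
    return "".join(t + b for t, b in zip(top, bottom))

def telecine_2_3(frames):
    """Toy 2:3 pulldown (1080/24p -> 1080/60i): every frame yields its two
    fields, and every other frame repeats one field, so 4 frames -> 10 fields
    and 24 frames/s -> 60 fields/s."""
    fields = []
    for i, frame in enumerate(frames):
        top, bottom = split_fields(frame)
        fields += [top, bottom]
        if i % 2:
            fields.append(top)          # the repeated field makes the cadence 2:3
    return fields

def ivtc_2_3(fields):
    """Toy inverse telecine: drop the repeated fields and weave the remaining
    pairs back into the original progressive frames."""
    frames, i, n = [], 0, 0
    while i < len(fields):
        frames.append(weave(fields[i], fields[i + 1]))
        i += 3 if n % 2 else 2          # skip the repeat on 3-field frames
        n += 1
    return frames

movie = [f"<frame {n:02d}>" for n in range(24)]    # one second of film (24p)
assert ivtc_2_3(telecine_2_3(movie)) == movie      # nothing lost if every step works
print(len(telecine_2_3(movie)))                    # 60 fields -> 1080/60i
```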
If every step works as designed, no information gets lost, and 1080p and 1080i transmission and presentation end up being identical as seen by the viewer.
What can go wrong? Or better, who can screw up? Both the player and the TV. The difference is that HD has one more "point of failure" when the transmission is done in 1080/60i.
We, consumers, are never told what players/TVs do with the signal. As has been reported recently, many of them don't do a proper IVTC.
Just look at Secrets' review of regular DVD players and what can go wrong.
Even fewer TVs do chroma upsampling properly (avoiding the CUE, the chroma upsampling error), which has nothing to do with interlaced vs. progressive. Another can of worms is over-/underscan.
How many TVs that accept 1080p can actually do 1080/24p? If they only do 1080/60p, 3:2 pulldown is applied, which is another source of errors.
How many TVs can display judder-free 1080/24p? According to Gregg Loewen, not a single LCD TV on the market can do this at the moment!
I've seen a Sony Ruby and a JVC RS-1 (both projectors) fed a 1080/24p video stream and displaying it at 72Hz. No judder! That is noticeable! Much more than anything else.
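(A quick back-of-the-envelope sketch of why 72Hz helps: on a 60Hz display, 24fps film frames have to alternate between being held for 3 refreshes and 2 refreshes, roughly 50ms and 33ms, and that uneven cadence is the judder; at 72Hz, which is exactly 3 x 24, every frame is held for 3 refreshes of about 41.7ms each. The helper below is my own illustration.)

```python
def hold_times_ms(display_hz, repeats_pattern):
    """How long each film frame stays on screen for a given refresh rate
    and frame-repeat pattern (rough back-of-the-envelope numbers)."""
    refresh_ms = 1000.0 / display_hz
    return [round(r * refresh_ms, 1) for r in repeats_pattern]

# 1080/60p display: 24 frames must fill 60 refreshes -> 3:2 pulldown,
# frames alternate between 3 and 2 refreshes (uneven hold times = judder)
print(hold_times_ms(60, [3, 2, 3, 2]))   # [50.0, 33.3, 50.0, 33.3]

# 72Hz display (like the Ruby/RS-1 setup above): 72 = 3 x 24, so every
# frame gets exactly 3 refreshes (even hold times = no judder)
print(hold_times_ms(72, [3, 3, 3, 3]))   # [41.7, 41.7, 41.7, 41.7]
```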
Bottom line: if judder doesn't bother you (and there is a 99%+ chance that your setup can't display a judder-free picture), you won't notice anything else.
Diogen.