resolution

Status
Not open for further replies.

carlosg

SatelliteGuys Family
Original poster
Aug 24, 2004
Thinking of getting an LCD TV, but the ones I saw had a resolution of 1366x768. So does that mean if I watch something in 1080, this TV will downgrade the picture?
 
Yes, it will. Though it will still look spectacular! No comparison with SD! And 768p is still better than 720p. ;)
You can get a 1080p set if you really want, but those are much more expensive.
 
Actually, not at all.
There's no 1080p broadcast whatsoever, only 1080i, which is not better: it's interlaced. One could argue that the spatial resolution is better, but what about the temporal resolution? Why stay stuck with interlacing?
Nah, 1080i is nowhere near better than 720p, and keep in mind, interlacing is HORRIBLE for sports.
Interlacing sucks; it's a bad-tasting legacy from the beginnings of television, and sooner or later we have to get rid of it forever.

PS: If he buys a 1080p set, that's nice, but it'll simply deinterlace the 1080i broadcasts and upconvert the 720p broadcasts.
 
I am not going to start another 720p vs. 1080i debate here, don't even try that! :D
But if the source is 1920x1080i and you are showing it on a 1366x768 display, technically speaking, you are losing resolution even though you are deinterlacing the image in the process! And if you watch the same content on a 1080p monitor, you will see a difference!

My point is that both 768p and 1080p can look spectacular with a good source, and depending on one's budget this may or may not be worth the price difference. Compared to what D* (and now E*) are doing in down-rezzing the picture to HD-Lite, the difference between 768p and 1080p may not be so important.
 
T2k said:
Actually, not at all.
There's no 1080p broadcast whatsoever, only 1080i, which is not better: it's interlaced. One could argue that the spatial resolution is better, but what about the temporal resolution? Why stay stuck with interlacing?
Nah, 1080i is nowhere near better than 720p, and keep in mind, interlacing is HORRIBLE for sports.
Interlacing sucks; it's a bad-tasting legacy from the beginnings of television, and sooner or later we have to get rid of it forever.

PS: If he buys a 1080p that's nice but it'll simply deinterlace the 1080i broadcast and upconvert the 720p broadcast.
Kind of funny considering your signature :rolleyes:
The debate over 720p vs. 1080i is nothing to get into a tizzy about.
Bottom line: 720p is better for sports and 1080i is better for movies. Not that it really matters, since most CRT-based rear-projection TVs are 1080i and the rest (DLP, LCD, and plasma) are 720p.
Add to the mix that ABC, ESPN, and Fox are 720p and ALL the rest are 1080i, so either way you're converting something to match your display, since no display is able to show both formats natively.
 
Do you do TRUE HDTV?
Evan Powell

Have you heard this one yet from a projector salesman…. "You don't want to buy THAT projector…it doesn't do TRUE HDTV." Well, certainly nobody would want to buy a projector that didn't do real HDTV, right? But they all claim to do HDTV. So what's the scoop?

It is easy to understand why the confusion exists. But it is also easy to sort it all out. First, let's start by defining HDTV. There are two common HDTV formats in use today, usually referred to as 1080i and 720p. The numbers refer to the number of horizontal lines in each frame of video (also known as "vertical resolution" since it is the number of horizontal lines as counted vertically from top to bottom of the screen). So in a 1080i signal, there are 1,080 lines per frame of video, and in a 720p signal there are 720 lines per frame.

The "i" and "p" indicate whether the signal is interlaced or progressive. In an interlaced signal, all of the even numbered lines are transmitted in one batch, followed by all of the odd numbered lines. (This is done to reduce transmission bandwidth.) In a progressive signal, all lines of the frame are transmitted at once in sequence. So with the interlaced 1080i signal, only 540 lines are recorded by the camera and transmitted at a time; they are then reassembled at the time of display. Meanwhile, with 720p, all 720 lines are recorded and transmitted in sequence.

Both of these signal formats maintain a 16:9 aspect ratio. That means the picture is 16 units in width for every 9 units in height. This is what has become known as the standard widescreen television format—all widescreen format TVs, plasmas, and projectors have a native 16:9 aspect ratio these days.

In order for an HDTV signal to maintain a 16:9 aspect ratio that matches the widescreen format, it needs to have 16 pixels on each line for every 9 lines of video in the frame. So a 1080i signal has 1920 pixels horizontally. That is why you will sometimes see the actual resolution of the 1080i format designated as 1920x1080. (If you divide 1920 by 16, then multiply the result by 9, you get 1080.)

Similarly, a 720p format signal has 1280 pixels on each line. So the physical resolution of the 720p format is often noted as 1280x720. (Once again, if you divide 1280 by 16, then multiply the result by 9, you get 720.)
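That 16:9 arithmetic can be checked in a couple of lines of Python (an illustrative sketch, not part of the article):

```python
# Sketch: the 16:9 arithmetic behind the common HDTV resolutions.

def width_for_16x9(lines):
    """Horizontal pixels needed so that `lines` scan lines form a 16:9 picture."""
    return lines * 16 // 9

for lines in (1080, 720):
    print(f"{width_for_16x9(lines)}x{lines}")
# prints 1920x1080 and 1280x720
```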

So far, so good. Now….what is TRUE HDTV? This is where it gets confusing, because people use the term to mean different things. Some people think that the only real, legitimate HDTV format is 1080i because it has the highest physical resolution. So they refer to 1920x1080 as true HDTV. Others have been calling 1080i "full HDTV," presumably to distinguish it from the less full 1280x720.

Fans of the 720p format object to this. They point out that progressive scanning produces a cleaner, higher resolution signal when the subject is in fast motion. It has no deinterlacing fuzziness. And since the 1080i camera captures only 540 lines at a time, the actual resolution of 1080i when the subject is in motion is only 540 lines, not 1080. So many folks think 720p is better for rapid motion sports like football and soccer, while 1080i is better for, say, golf, where people are just basically standing around.

The fact is that both 1080i and 720p are great HDTV formats that look a lot better than standard television. Both formats are being broadcast by the major networks today, so your projector needs to be able to display both of them, and all projectors that are HDTV compatible do in fact display both of them.

So what does it mean to ask "does your projector display true HDTV?" Often what is really meant is, "does it need to re-scale the image?" In other words, does the video information coming in on the HDTV signal need to be either compressed or expanded to fit the physical resolution of the projector? In most cases, it does.

Any given projector has just one physical resolution, usually called the native resolution. Native resolution is the number of pixels actually available on the display. So an SVGA projector, for example, has display panels or chips with a native 800x600 pixel matrix. In order to display a 16:9 signal, it uses an active area of 800x450 on the display. So any HDTV signal that it gets, whether it is 1280x720 or 1920x1080, it must reformat (compress) that incoming signal into 800x450 before feeding it to its internal display. So no matter what, it cannot display any HDTV signal without compressing it, and losing a bit of image detail in the process.

This is true of standard XGA resolution projectors as well. They have a native resolution of 1024x768. In order to display a 16:9 image, they use an active portion of their display that is 1024x576, which is a 16:9 matrix. Therefore the HDTV signals, whether 1920x1080 or 1280x720, must be compressed to fit into a 1024x576 matrix before they are displayed.
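The "active area" figures in the two examples above follow from the same 16:9 arithmetic. A minimal Python sketch (mine, not the article's) that reproduces them:

```python
# Sketch: the largest 16:9 "active area" that fits inside a native pixel matrix,
# matching the SVGA (800x600) and XGA (1024x768) examples above.

def active_16x9_area(native_w, native_h):
    """Return (width, height) of the biggest 16:9 region within native_w x native_h."""
    h = native_w * 9 // 16              # height implied by using the full width
    if h <= native_h:
        return native_w, h
    return native_h * 16 // 9, native_h # otherwise the height is the constraint

print(active_16x9_area(800, 600))   # (800, 450), as in the SVGA example
print(active_16x9_area(1024, 768))  # (1024, 576), as in the XGA example
```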

Many new home theater projectors have native 1280x720 LCD panels or DLP chips. These are built expressly for the purpose of displaying HDTV 720p without needing to compress it or expand it. Some would say that projectors with the 1280x720 matrix are true HDTV projectors. However, some wouldn't, because when they get a 1080i signal these projectors still need to compress the 1920x1080 information into their native 1280x720 displays.

For the purist with unlimited funds, the only real, genuine HDTV projector is one with 1920x1080 internal resolution. These will display 1080i without any compression. There are a small handful of projectors on the market with this resolution, and at the moment they cost $20,000 and up. But these units need to reformat 720p signals, scaling them up to fit their native 1920x1080 displays. Technically, then, you could say that even these units are not true HDTV when it comes to 720p format.

The bottom line is that all projectors are built to scale a wide variety of incoming signal formats into their one native display. They will all do standard television, they will all do DVD, and almost all of them will do HDTV 1080i and 720p as well. In addition, most of them will display a variety of computer resolutions, including SVGA, XGA, and so forth. Really, when it comes to HDTV, there are only two circumstances where scaling is not required: 720p for a projector with 1280x720 native resolution, and 1080i for a projector with 1920x1080 resolution. Other than for those two unique matches, scaling is always required no matter what.

So this whole issue about "true HDTV" misses the point. Even the cheapest low resolution projectors will display HDTV pictures that look better than any television you ever saw. The fact that you are seeing a compressed signal is quite beside the point. Scalers have gotten so good these days that even low resolution projectors deliver amazing HDTV quality for the money, even after the compression. So who cares if it isn't "true HDTV?"

The real question is how much are you willing to spend on a projector? Generally, the projectors with higher native resolutions tend to cost more than those of lower resolution. With higher resolution you get reduced pixelation, and usually a smoother, cleaner, more filmlike image. And you usually get these improvements, to varying degrees, on all video sources whether they be television, DVD or HDTV. Getting better image quality across the board is usually a more important key to your overall viewing satisfaction than the question of whether the HDTV image is scaled or not.

Yes, it is true that today's 1280x720 format projectors are indeed particularly impressive for 720p display. But the amount of 720p material you will view compared to everything else will probably be rather small unless your weekly video entertainment consists mainly of HD sports broadcasts from ABC, ESPN, and Fox. And meanwhile, 1080i can look spectacular on a 1280x720 projector, even though the 1080i signal is compressed and not "true" 1080i.

Therefore, next time a salesman says, "Don't buy that projector, it doesn't do true HDTV," think twice and don't take his word for it. That relatively inexpensive projector you are considering just might deliver the best possible HDTV picture for the money on the market.
 
Kevinw said:
Kind of funny considering your signature :rolleyes:

IIRC, I never said 1080i is not gorgeous compared to 480i or even 480p. :rolleyes: One would think it's obvious... 1080i is much better than any non-HD resolution, but not better than any progressive HD resolution. :p

BTW, I have a Z1U, which is 1080i. Why? Because A) there wasn't any prosumer (sub-$10K) 3CCD 720p camera available last spring, B) the old JVC HD10U wasn't really usable without proper studio lighting, and C) the new 720p HD100 only came out a few months ago and still doesn't have a p60 recording mode (though it can output uncompressed 720p60, and I've seen some new laptop-based tool to record it... mmmm...)

The debate over 720 vs 1080i is nothing to get into a tizzy about.
Bottom line -720p is better for sports and 1080i is better for movies.

Exactly why would 1080i be better for anything?

I'm very curious about the reasons behind this "1080i is better for movies" statement...

NOT that it really matters since most CRT based rear projetion TV's are 1080i

Correct.

and the rest- DLP,LCD and Plasma, are 720p .

Or 1080p, as all the new model lines are showing. ;)

Add to the mix that ABC, ESPN and Fox are 720p and ALL the rest are 1080i - So either way your converting something to match your display since no display is able to show both formats natively.

Well, a 1080p display is still the best case: it either upconverts 720p (best possible PQ) or deinterlaces 1080i, but never both, unlike any other (720p or 1080i) display would. :cool: And it is the most future-proof as well, that's granted. :)
 
Wow, you took the sig thing a little personally, huh? In one statement 1080i sucks, but in the sig it's fabulous... no need to get defensive about it, it's just funny... call it my bizarre sense of humour :)

I almost agree with you: if I were buying a new set today, I'd get a 1080p. But if I was not about to replace one yet, I would wait until 1080p is actually broadcast and get a model that accepts a 1080p signal, versus converting the current formats to it.

Now, back to the original topic: dollars-and-cents wise, get what fits your budget, which in today's world of best bang for the buck is a 720p display.
 
carlosg said:
Thinking of getting an LCD TV, but the ones I saw had a resolution of 1366x768. So does that mean if I watch something in 1080, this TV will downgrade the picture?

Yes, it will downconvert 1080i to 768p and upconvert 480i to 768p.
 