Resolution Questions


Gene T.

Member
Original poster
Apr 17, 2007
I have a Samsung 720p DLP. I was reading that DIRECTV sends all their HD content in 1080i. My first question: since my TV can't display an interlaced signal, what resolution does it actually put out? Is it 540p? I basically have the same question about TV station resolutions. CBS and NBC broadcast in 1080i, while ABC and FOX use 720p. So is my 720p Samsung showing a better resolution from ABC and FOX than from CBS and NBC if they were over-the-air broadcasts? If that's the case, is the best resolution I'm getting from DIRECTV for all my HD content really 540p, which isn't much better than a DVD at 480p? Just curious.
 
The 1080i60 signal is deinterlaced (line doubled) to 1080p30, each frame is repeated to get 1080p60, and the result is run through a resize algorithm to get 720p60.

1080i60 on a 720p display will show more conversion artifacts and motion blur during motion; static images should have the same detail from both 1080i60 and 720p60 content on a 720p display.
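The pipeline described above (weave two fields into a frame, then resize) can be sketched in a few lines of NumPy. This is a rough illustration only: the function name is invented, and real scalers use filtered interpolation rather than the nearest-neighbor sampling shown here.

```python
import numpy as np

def i1080_to_p720(top_field, bottom_field):
    """Sketch of one step of 1080i60 -> 720p60 for a single field pair.

    top_field and bottom_field are (540, 1920) arrays holding the odd
    and even scan lines of one interlaced frame.
    """
    # Step 1: de-interlace by weaving the two fields into a 1080p frame
    frame = np.empty((1080, 1920), dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field

    # Step 2: resize 1920x1080 down to 1280x720 by nearest-neighbor
    # row/column sampling (real hardware uses filtered scaling)
    rows = np.arange(720) * 1080 // 720
    cols = np.arange(1280) * 1920 // 1280
    return frame[rows][:, cols]

# Each de-interlaced frame would then be shown twice to restore 60 frames/s.
out = i1080_to_p720(np.zeros((540, 1920)), np.ones((540, 1920)))
print(out.shape)  # (720, 1280)
```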
 
Your TV will upconvert to 1080i. And I always thought 1080p was 60 frames per second and 1080i was 30 fps, meaning 1080i only refreshes every other line at a time, where 1080p or 720p refreshes every line at once. Maybe I'm wrong?
 

Hemi,
Unless I read your statement wrong, I believe you are correct.

As for the OP, the TV will do what it can, but it also depends on what output setting he has chosen on his D* receiver.

Jimbo
 
Questions like this are always hard to answer since it really depends on your TV. I have an LCD that does 1080i and 720p. With that specific TV I find 720p looks the best even if the source is coming in at 1080i. So I would try what Charper1 suggested: turn native off, keep it on 720p for everything, and just evaluate how it looks. If it looks good, then enjoy the HD.
 
Yep, my projector is a great example. It is 720p native, but I have my DirecTV box set to 1080i out only; it just looks best to us. In the bedroom we have an older UltraVision, and I have that HD box set to a hybrid mode: when it "sees" SD it outputs 480p, and when it sees HD it outputs 1080i.
 
Your TV will upconvert to 1080i. And I always thought 1080p was 60 frames per second and 1080i was 30 fps, meaning 1080i only refreshes every other line at a time, where 1080p or 720p refreshes every line at once. Maybe I'm wrong?

You got it: 1080i is 30 fps, 60 fields per second. I've seen it written both ways, 1080i60 and 1080i30, referring to refresh rate or fps. It would have been clearer if it were written 1080i60@30fps or 1080i30@60Hz. I don't know; 1080i60 makes more sense to me, but 1080i30 is probably more correct.
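The arithmetic behind the two namings, as a quick sanity check:

```python
# 1080i: 60 interlaced fields per second pair up into 30 full frames.
fields_per_second = 60
frames_per_second = fields_per_second // 2  # two fields make one frame
lines_per_field = 1080 // 2                 # each field carries half the rows
print(frames_per_second, lines_per_field)   # 30 540
```

So "1080i60" counts fields and "1080i30" counts frames; both describe the same signal.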
 
Still Confused

I guess my last question is this: if I set my DIRECTV HD DVR box to 720p, which is the highest resolution my Samsung DLP can produce, what is the resolution on my TV if the signal coming in is 1080i? My TV cannot show interlaced signals, so does anyone know what my resolution would be? One more question: if DIRECTV sends all its signals down from the satellite in 1080i, what are ABC's and FOX's resolutions? I know that they send their signals at 720p. Still curious. Frame rates I'm not really interested in; 1080i is at 30fps and 720p is at 60fps. 1080p is at 60fps and is only for HD DVD right now, not broadcasts. Thank you for your help.
 

A native-resolution 1280x720 display is fixed, and all content will have to be converted to 720p to match the display. LCD, plasma, DLP, and LCoS are all fixed-pixel microdisplays; only CRT-based displays are not.

A 720p source can be up-converted to 1080i by the set-top box before being displayed on a 720p display. A progressive display must update both fields at the same time. However, since each field represents a different moment in time, a moving ball is split across the fields, introducing interlace artifacts (i.e., the ball during a football pass will show jagged edges during motion).
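A toy example of that combing artifact, illustrative only: weave two fields captured 1/60 s apart while an object moves two columns, and the object lands in different columns on alternating rows.

```python
import numpy as np

# Two fields captured 1/60 s apart; the "ball" moves from column 10 to 12.
top = np.zeros((4, 20), dtype=int)
bottom = np.zeros((4, 20), dtype=int)
top[:, 10] = 1
bottom[:, 12] = 1

# Weaving them into one progressive frame splits the object across
# alternating rows: the classic jagged "combing" edge during motion.
frame = np.empty((8, 20), dtype=int)
frame[0::2] = top      # even rows show the earlier moment
frame[1::2] = bottom   # odd rows show the later moment
print(frame[0, 10], frame[1, 12])  # 1 1 -- same object, two positions
```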

Now, with a 1080i source processed to 720p, it all depends on which device (set-top box or TV) has the better de-interlacer. Some TVs do an excellent job at de-interlacing, with more options than the set-top box.

Most likely the TV will do the better job of processing 1080i to 720p for your display, so I would use "native" pass-through to get the best PQ for 720p and let the TV do the 1080i (source) to 720p (display) conversion, but you'll have to test to verify.

Your best bet is to join other owners of your display at AVS Forum - Home
and see what they think the best settings are, or you could test yourself with a test HD DVD.

HQV - Hollywood Quality Video Processing for HD : Benchmark DVD
Interlacing - Luke's Video Guide
 
Gene,
Let your eyes do the talking.
If you look at it and don't like it, try the other setting.
It all depends on what YOU are seeing; find the one that looks best for your setup and go with that.

Don't sit and nitpick once you have made a choice, or you'll never enjoy your viewing experience.

Jimbo
 
I was reading that Directv sends all their HD content in 1080i.
The information you read was either in error, or you interpreted it incorrectly. DIRECTV sends both 1080i and 720p. It is also true that most, if not all, DIRECTV 1080i content is reduced horizontally to either 1280 or 1440 before it is sent. Apparently, if it isn't HD sports, it isn't worth the bandwidth.
My 1st question is if my TV doesn't display an interlaced signal.
This is incorrect. Your TV cannot display an interlaced picture, but it can display interlaced source material by "scan converting" it to progressive.
I basically have the same question with TV station resolutions. CBS and NBC are in 1080i and ABC and FOX are in 720P. So is my 720P Samsung showing a better resolution from ABC and FOX than CBS and NBC if they were over the air broadcasts.
I can't tell you as I've never seen your TV. If the scan conversion hardware in your TV is any good, either mode should look pretty good.

Over the air broadcast is the reference standard. It isn't always as good as it could be, but re-casting doesn't improve the picture quality.

Understanding it isn't going to make it any better. It can make it worse. Check out any discussion of the Sony Trinitron "wires".
 
Some TVs do an excellent job at de-interlacing, with more options than the set-top box.
Don't confuse de-interlacing (automatic with frame storage) with scan conversion. Scan conversion is the process of going from 1080 lines to 720 lines and that happens after de-interlacing.
 
One more question: if the signal sent to my TV is 1080i, as with CBS or NBC, can it be converted to 720p? And if it is converted, is it true 720p like ABC and FOX? I was told that a signal coming in at 1080i 30fps will be converted to 540p 60fps on my TV, which is a 720p TV. My TV is a progressive TV, so it will not show an interlaced signal; it has to convert it to a progressive signal. So does it always have to convert to 60fps? If it does, is it true that a 1080i 30fps source will be converted to 540p 60fps on my TV?

I know this question seems redundant, but I don't seem to be getting a straight answer on it. I keep getting asked what my TV or satellite box is set at; what I'm talking about is an outside-antenna source from CBS or NBC in high def, which is 1080i 30fps. My TV is set at 720p 60fps. Personally, what I thought would happen is that the 1080i 30fps signal would be converted to 540p 60fps, because a 1080i interlaced signal only has 540 lines every 60th of a second, and the fields flip back and forth (which accounts for the slower 30fps rate). How can the TV add 180 lines of resolution at that frame rate? From what I understand, 1080i only shows 540 even lines in one 60th of a second and another 540 odd lines in the next 60th of a second, which equals 1080 lines every 30th of a second, flickering back and forth. So my confusion is: how can a 1080i 30fps signal become 720p 60fps?
 
One more question, If the signal that is sent to my TV is 1080i like with CBS or NBC can that be converted to 720P?
Of course.
And if it is converted is it true 720P? like ABC and FOX.
It won't be as sharp as something that started out as 720p, because there is a lot of complicated averaging going on to represent 2 megapixels' worth of information in one megapixel's worth of display.
I was told that a signal coming in at 1080i 30fps will be converted to 540P 60fps on my TV, which is a 720P TV.
Don't believe everything you are told (or perhaps what you think you were told). If your TV is an LCD or plasma, it probably isn't even a true 720p display!
My TV is a progressive TV, so it will not show an interlaced signal; it has to convert it to a progressive signal. So does it always have to convert to 60fps?
Most modern non-CRT televisions actually scan at 120fps. Unless you're bothered by fluorescent lighting, it doesn't really matter what the frame rate of the TV is.
If it does have to do this then is it true that a signal from a 1080i 30fps source will be converted to a 540P 60fps on my TV.
In order for your TV to fill the screen, the signal will be scaled down from 1920x1080i to something that fits your TV's matrix (which may not be 1280x720).

You really need to get over the 540 line thing as it isn't happening. As I cautioned earlier, sometimes knowing too much about how things are working takes away from the enjoyment. If the picture looks good, enjoy it. If it doesn't, acquire a TV that does. Like your line of questioning, the process that you're putting yourself through is a death spiral.

As I alluded to earlier, if your TV is an LCD or plasma, it is entirely possible that its display is not 720 lines tall. Most of these TVs are actually 768 pixels tall. If you really want to spend some quality time fussing over the details, look into the process of scaling and see how it is done.
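Scaling at a non-integer ratio like 1080 rows onto a 768-row panel is exactly where that process matters: each output row has to blend neighboring input rows. A minimal linear-interpolation sketch (the helper name is invented for illustration; real video scalers use multi-tap filters):

```python
def scale_rows(src, out_count):
    """Linearly resample a list of row values onto out_count rows."""
    n = len(src)
    out = []
    for i in range(out_count):
        pos = i * (n - 1) / (out_count - 1)  # map output row to source position
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        # Blend the two nearest source rows in proportion to distance
        out.append(src[lo] * (1 - frac) + src[hi] * frac)
    return out

scaled = scale_rows(list(range(1080)), 768)
print(len(scaled))            # 768
print(scaled[0], scaled[-1])  # 0.0 1079.0
```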
 
I stated earlier in the thread that my TV is a Samsung 720p DLP (1280x720). If it were an LCD or plasma, I wouldn't have any of these questions. LCDs and plasmas can show a true interlaced and a true progressive signal; only a DLP cannot show a true interlaced signal and must convert it. Am I correct with that statement? Also, are you stating that a signal sent at 1080i 30fps from NBC or CBS over the air will be converted to 720p 120fps?

This is where I get very confused. It's like with a DVD: a normal DVD is burned at 480p 60fps. I always see advertising for DVD players that can upconvert that resolution to "near" high-def quality (notice the dreaded word "near"). I always wondered how a DVD burned at a certain resolution can be upgraded; it is what it is, and anything the DVD player does to change that original resolution adds artifacts. It's sort of like plasmas that are 1024x720, which aren't really true high-def monitors but are advertised as such.

These questions come up because I plan to buy a new TV for my basement. It's very dark, so a DLP will be plenty bright down there, and DLPs have the best size-to-cost ratio. Upstairs in my family room I can see a true difference between HD channels: ABC and FOX look better than CBS and NBC. I know that ABC and FOX come through my TV at a true 720p 60fps for high-def programming. What I am not sure about is what CBS and NBC come through at. Does anyone know what the true resolution and frame rate are for CBS and NBC after a 720p DLP TV converts them?

From what I have learned, 1080i is less information than 720p. The math seems easy: a 1080i signal has 1080 lines of resolution every 30th of a second, while a 720p signal has 1440 lines every 30th of a second. Looked at the other way, a 1080i signal has 540 lines every 60th of a second and 720p has 720 lines every 60th of a second.
From what I understand, the 720p signal is better and contains more info, but the 1080i signal uses interlacing to trick the eye into thinking it's seeing 1080p resolution. A DLP cannot do this, hence the question asked earlier. Finally, from my own observations, CBS and NBC high-def programming looks a little better than a DVD, but neither looks as good as FOX and ABC high-def programming.

Quote:
(In order for your TV to fill the screen, the signal will be scaled down from 1920x1080i to something that fits your TV's matrix, which may not be 1280x720.)

Remember, my TV will only show a progressive signal. So does it convert the interlaced signal (1920x1080i 30fps) to 960x540p 60fps, or to 1280x720 30fps? Hence the original question about frames per second. Maybe the difference I'm seeing between CBS/NBC and ABC/FOX is based more on frame rate than on resolution. Still looking for that answer. Thank you for all your help.

P.S.: I don't look at this as fussing; I find it very interesting and would just like to understand it better. If you find my questions redundant, don't reply to them, but thank you anyway for trying.
 
LCDs and plasmas can show a true interlaced and a true progressive signal; only a DLP cannot show a true interlaced signal and must convert it. Am I correct with that statement?
No. All non-CRT technologies de-interlace the signal (as I said earlier).
Also, are you stating that a signal sent in 1080i 30fps from NBC or CBS over the air will be converted to 720P 120fps?
It won't be converted to an image that changes 120 times per second. It will be scanned down the screen at that rate. It will still be a 1080i signal that has been scaled down and de-interlaced.
This is where I get very confused. It's like with a DVD: a normal DVD is burned at 480P 60fps.
You should quit while you're behind. A normal DVD is 480i.
I always wondered how a DVD that is burned at a certain level of resolution can be upgraded, it is what it is, anything done by the DVD player to change that original resolution adds artifact.
The 480i signal is scaled up to the appropriate HD pixel matrix.
It's sort of like plasmas that are 1024x720, which aren't really true high-def monitors.
Most non-1080p plasmas are 13nnX768 or some oddball resolution, but your point is made. The assertion that they aren't "true HD" is debatable.
DLP's have the best size to cost ratio.
If you're going for 1080p, the LCoS TVs are very competitive and have no moving parts (except maybe a cooling fan).
What I am not sure about is what CBS and NBC come through at. Does anyone know what the true resolution and frame rates are for CBS and NBC after a 720P DLP TV converts them?
That depends entirely on the magic of the processing system of the TV. Typically, it would be a de-interlaced and scaled version of the 1080i programming. Some televisions are much better than others at scaling and that's really what it comes down to.
From what I have learned 1080i is less information than 720P. Just the math of it is easy, for a 1080i signal there are 1080 lines of resolution every 30th of a second. For a 720P signal there is 1440 lines of resolution every 30th of a second.
The math may look easy, but that is because you're doing it wrong. You've neglected to factor in the horizontal resolution! Except for the DIRECTV brand of 1080i, there is 50% more horizontal detail in the 1080i signal. With DIRECTV, the horizontal detail for 1080i and 720p is the same.
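Redoing the math with horizontal resolution included (full broadcast rasters assumed) bears this out:

```python
# Pixels delivered per second, full-raster assumptions:
i1080 = 1920 * 1080 * 30   # 1080i: 30 complete frames/s after weaving
p720  = 1280 *  720 * 60   # 720p: 60 complete frames/s
print(i1080, p720)         # 62208000 55296000 -- 1080i carries ~12% more

# A horizontally reduced 1080i feed (e.g. 1280x1080) carries less:
print(1280 * 1080 * 30)    # 41472000
```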
But a DLP cannot do this, so hence the question thats asked eariler.
No, your DLP can't do it. 1080p displays of any technology can do 1920x1080p all day long without scaling.

I would recommend a significantly different approach than the one you're taking:

Visit a storefront that has DIRECTV receivers hooked up to their televisions and compare them; side by side if possible. No amount of reasoning is going to make one TV look better than another. It comes down to the quality of the electronics. DLP is one of the most finicky technologies at the high end because it involves some very mechanical things to make the picture happen. You've got the spinning color wheel and those crazy wobbling (wobbulating) mirrors.

Shopping statistics isn't going to get you where you need to be. You need everyone who is going to watch the TV to watch it in person with the kind of material that you're going to watch. Only then will you know for sure what TV is right.
 
Now I am confused. I thought 720 and 1080 were the horizontal resolutions, but one is scanning quicker than the other. On progressive you're seeing all 720 lines every time; on interlaced you're only seeing half the lines (540) at the same speed you're seeing the progressive. Did you mean the vertical resolutions, which are 1920 for interlaced and 1280 for progressive? So am I to believe that no one knows the resolution of CBS or NBC on a Samsung 720p DLP after it converts the 1080i 30fps signal? I know it can't be 720p 60fps, because if that were the case the video would look the same for CBS and NBC as it does for ABC and FOX, but THEY DON'T LOOK THE SAME; CBS and NBC are inferior. And again, I don't know why I have to keep saying this: I'm only talking about over-the-air antenna high-def broadcasts, not DIRECTV signals. So after it is scaled down and de-interlaced, what are the actual resolutions of CBS and NBC on a Samsung 720p DLP? Are you telling me that the resolutions of CBS and NBC on all DLPs differ due to the scaling and de-interlacing algorithms of each individual DLP TV? There's no way to know what the actual resolution and frame rate are? Does my Samsung 720p always have the same 60fps frame rate for video, or does it change? To see where I have learned some of my info, go to this web site.

720 > 1080

This web site might help you better understand where I'm coming from. I have read in many journals that CBS and NBC might switch to 720p; one reason this didn't happen at first was that General Electric, which owns NBC, still had many CRT TVs in production and for sale, so NBC wanted to stay with an interlaced signal a little longer. We already know that 1080p is considered much better than 1080i, as is 720p. So I am just wondering if I should buy another DLP and hope they change over. I want to get a 71" TV; plasmas and LCDs that are 1080p are still over $5,000 for >60", but DLPs are around $2,000-2,500, half the price, so you can see my dilemma. I find it funny that it seems impossible to find out the resolution and frame rate after conversion from interlaced signals on DLPs. Thank you for your help; check out that web site and see what you think.

P.S.: I am still not sure about your rationale for 1080p DLP TVs.

Quote:
(No, your DLP can't do it. 1080p displays of any technology can do 1920x1080p all day long without scaling.)

Yes, they can do 1920x1080p without scaling, but who cares? No one is broadcasting 1920x1080p; that's only on high-def DVDs. Again, I'm talking about over-the-air broadcasts of high-def programming, which only come in two resolutions: 1080i and 720p. Actually, a high-def DVD looks great converted down from 1920x1080p to 1280x720p on my TV; I have a PlayStation 3 and many 1080p high-def DVDs. But remember, a 1080p DLP must still convert a 1080i signal from CBS and NBC the same way; it is still an interlaced signal that must be converted, and it is in that conversion that the quality of the programming suffers on DLPs. That is why I was wondering how far the quality is decreased, hence the question of where the final resolution lands after conversion, scaling down, de-interlacing, etc. But this seems to be the toughest question in TV to get an answer on. I know that I am not seeing 1280x720p 60fps on CBS and NBC; I am seeing some type of hybrid resolution. You stated you KNEW it can't be 960x540p; I don't know how you can make such a strong statement about conversion resolutions and not know what the final resolution is. I can tell you by looking at my TV that ABC and FOX look like 1280x720p, CBS and NBC look less than that, which could be 960x540p, and a DVD looks less than CBS and NBC, like 854x480p. I'm not positive about those numbers, but the clarity of the pictures seems to decrease about that much per signal. Like I said, CBS and NBC don't look much better than a DVD, but ABC and FOX look much more defined. All three of my signals go through HDMI.

As far as looking at TVs in stores, it's totally pointless. They have different source material, different types of cables (component vs. HDMI), different grades of cables (Monster vs. cheap), and some of the TVs have different settings. Some have the brightness adjusted differently; in the DLP cases, some have stronger or older bulbs, which affects lumens and clarity; some even have very different cable lengths, which affects clarity. It's practically impossible to get a real definitive assessment of different TVs outside a lab environment, so I find that type of reasoning and/or shopping pointless. I just wanted to know what the conversion resolution was, since you stated it COULD NOT POSSIBLY be 540p. Thank you for your help. It also seems crazy that CBS and NBC went with an interlaced signal when none of the new TVs after CRTs can show a true interlaced signal the way a CRT can. I thought LCD and plasma TVs could show a true interlaced signal; I thought they were both progressive and interlaced, unlike DLPs, which are only progressive. So a plasma/LCD can't flicker back and forth like a CRT? Boy, I am gone on this stuff; hey, that's why I'm asking the questions. Thanks again.
 
On the interlaced you're only seeing half the lines (540) at the same speed you're seeing the progressive.
On your television, all you ever see is presented in a 1280x720 pixel matrix. It doesn't matter what the source material is (obviously SD will have bars on the sides).

You would do well to forget about lines and start talking about rows and columns instead. Lines are something that don't really exist in the digital domain (although a good TV will make the rows and columns of pixels look continuous).
Did you mean the vertical resolutions, which are 1920 for interlaced and 1280 for progressive?
No. I meant that there are 1920 vertical columns of 1080 pixels with 1080i and 1280 vertical columns of 720 pixels with 720p.
So am I to believe that no one knows the resolution of CBS or NBC on a Samsung 720P DLP after it converts the 1080i 30fps signal?
The resolution at which all programming is displayed is whatever resolution your TV is. EVERYTHING gets scaled to that resolution. Motion processing circuitry handles the frame rate differences and attempts to remove the artifacting that may result.
I know it can't be 720P 60fps because if that were the case the video would look the same for CBS,NBC,as it does on ABC and FOX but THEY DON'T LOOK THE SAME, CBS and NBC are inferior.
Don't tell us what you know as your track record hasn't been particularly good. What counts is your perception of the picture quality and no amount of numbers and long-winded justifications should change that.
And again, I don't know why I have to keep saying this: I'm only talking about over-the-air antenna high-def broadcasts, not DIRECTV signals.
Perhaps because you posted your questions in the DIRECTV HD Discussions forum?
So after it is scaled down and de-interlaced, what are the actual resolutions of CBS and NBC on a Samsung 720P DLP?
The pixel display matrix of your existing TV is, was, and always will be 1280x720, regardless of the source material. You cannot change that. You could go to an outboard processor that may do a better job of scaling and motion compensation.
Are you telling me that the resolutions of CBS and NBC on all DLPs are different due to the scaling and de-interlacing algorithms of each individual DLP TV?
Ah, the light comes on. The resolutions don't change so much as your perception of the quality changes.
There's no way to know what the actual resolution and frame rate are?
The key is in understanding that resolution and frame rate aren't as important as what it looks like to your eyes. Your perception of quality CANNOT (and should not) be determined by specifications.
Does my Samsung 720P always have the same 60fps frame rate for video or does it change?
Your TV always displays a 1280x720 pixel image at about 120fps. This is important because the frame rate of source material varies from 24fps to 60fps.
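The cadence math is why a roughly 120 Hz refresh is convenient: 120 divides evenly by each common source rate, so every source frame is held for a whole number of refreshes.

```python
panel_hz = 120
for fps in (24, 30, 60):
    holds = panel_hz // fps   # refreshes each source frame is shown for
    print(fps, "fps ->", holds, "refreshes per frame")
# 24 -> 5, 30 -> 4, 60 -> 2: no uneven 3:2 cadence needed at 120 Hz
```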
To see where I have learned some of my info, go to this web site.
With all due respect to Alvy Smith, he doesn't see the world through your eyes. Don't take it too seriously other than to use it to understand the importance of motion compensation capability.
I have read in many journals that CBS and NBC might be switching to 720P; one of the reasons this didn't happen at first was that General Electric, which owns NBC, still had many CRT TVs in production and for sale, so NBC wanted to stay with an interlaced signal a little longer.
You read way too much. Talk of CBS and NBC moving to 720p is not only untrue, but it is more likely that everyone will move towards 1080 to take advantage of the new HD disc formats. Most of this is conjecture and rumor based on an incomplete understanding of what's going on.
...I want to get a 71" TV, Plasma and LCD's that are 1080P are still over 5,000 for >60" but the DLP's are around 2,000-2,500 half price you can see my dilemma.
Here's another case of your trying to understand things where it doesn't really matter. Chances are pretty good that any 70" TV that you buy today will be a 1080p model.
I find it funny that it seems impossible to find out the resolution and frame rate after conversion from interlaced signals on DLP's.
It isn't impossible -- it just doesn't weigh enough into the equation when factored in with everything else.

I suggest that rather than fussing over specifications and mathematics, you plant yourself in front of a number of affordable TVs in the size range that you seek and see how you like them. Test driving is a much more comprehensive test than all of the numbers and theories that you can assemble.
 
Still confused, but it doesn't matter. I know my TV always shows the same pixel count; it's the resolution (clarity) that differs. I believe my TV can scan at 120fps, but that doesn't matter if the info coming in is at 60fps or 30fps. My thinking was that my TV would take the 1920x1080i 30fps and convert it to 1280x540 at 60 or 30fps, and that is what it looks like. What I think is happening is that I am seeing half the lines at 30fps or 60fps when it de-interlaces and converts the 1080i signal. Those columns of 1280x540 at 30fps or 60fps are just stretched to fill my TV's pixel matrix, which is 1280x720. When I'm watching ABC or FOX, no stretching is needed; they send the proper resolution to fill my screen perfectly. CBS and NBC do not.

As far as the networks switching to 1080p, you're going to wait a long time for that. It's so much bandwidth that it's not worth the compression the signal would have to go through. If that were the case, why did ABC and FOX go with 720p instead of just going to 1080p? I have read it could be 10-20 years before the networks go to 1080p. The reason I originally posted the message on the DIRECTV message board was that I had read that DIRECTV sent all its programming down from the satellite in 1080i; you stated that they send both 1080i and 720p, so I was told wrong.

In conclusion, I'm not the only person who sees a difference between ABC/FOX and CBS/NBC. Other friends with DLPs see the difference, but friends with LCDs and plasmas don't see as much difference. It just doesn't make much sense. My wife also sees the difference. We both think the over-the-air high-def broadcasts on ABC and FOX look better than they do on DIRECTV HD; who knows what conversions DIRECTV is doing (I remember reading about HD Lite, etc.). In the end, I won't argue that my TV is showing a 1280x720 pixel matrix; what I'm stating is that some high-def programming is being distorted and some is pristine, and there is a difference.
"Perception" is a weak word; it's not my perception of the clarity. You have even stated that there are all sorts of conversion algorithms, etc. Those aren't perceptions; those conversions decrease the clarity of the images shown. That's not a perceived decrease in clarity; it is a decrease in clarity, which would be FACT, not PERCEPTION. Can you also tell me where in TV broadcasting video is sent at 24fps? I thought only film was at that speed, and that all video on TV since the '70s was at least 30fps. Very confused.
 
