corvettelover Posted March 17, 2010

Okay, I've got this LCD HDTV: http://www.walmart.com/ip/Philips-32-Lcd-Hdtv/10965942 and an Xbox 360 (the newest model, with 512 MB) with a gold-plated HDMI 1.3b cable. I'm wondering: is 1080i better than 720p? The games look much nicer with the HDMI cable at 720p, but I'd like to know if it would be better to run at 1080i instead of 720p.

Games we have (note: I don't play some of them):

Grand Theft Auto: Episodes from Liberty City
Saints Row 1
Tomb Raider: Underworld
G-Force
Forza 2
Call of Juarez: Bound in Blood
Blazing Angels
GRiD
Terminator Salvation
Spiderman Ultimate Alliance (or something like that)

TV specs from Walmart (if you're too lazy to click the link):

- Philips 32" LCD HDTV: brilliant picture quality
- Pixel Plus HD for better detail, depth and clarity
- HD LCD display with a 1366 x 768p resolution
- Dynamic contrast for incredibly rich black details
- Crisp and clear sound
- 2x 10 W RMS audio power
- Dolby Digital output for connection to a home theatre system
- Virtual Surround Sound for enhanced sound
- Easy to connect and enjoy
- Settings assistant for effortless personalized TV settings
- 3 HDMI inputs with EasyLink for HD connection
- Ready for digital
- ATSC & QAM tuner receives over-the-air and unscrambled cable
dice Posted March 17, 2010

1080i is like 540p because it can display 540 (horizontal square) lines of display resolution, where 720p displays all 720 line at any given time. But since these TVs come in 16:9, 1080i actually has more pixels (1920 * 540 (1080i) > 1280 * 720) which means better quality. And when converting ordinary movies to full HD, you get a 1080p quality movie but with half less fps (~30), which is like the same as watching television. The only difference is that the real 1080p has twice the amount of frames
Otter Posted March 17, 2010

Ignore everything he just said. You'll probably have better quality if you use 720p, and like you said, they look better. It's far closer to the native resolution of your TV (the great equalizer in this equation). 1080i is best kept as a broadcasting format these days. Some games may make better use of 1080i by essentially encapsulating a progressive signal in the interlaced format (i.e., both fields combine to make one progressive frame), but even this is not ideal compared to 720p on a TV like yours, as fine detail may be lost in the scale-down process.
dice Posted March 17, 2010

May I be informed as to why my post is totally wrong?
Otter Posted March 17, 2010

Sure.

"1080i is like 540p because it can display 540 (horizontal square) lines of display resolution, where 720p displays all 720 line at any given time."

Semantics, I'm sure, play a part here, but you're missing the point. 1080i is typically 60 fields per second: that's 30 frames composed of 2 fields each. If, like I said earlier, both fields are the same frame temporally, then essentially you're dealing with a full "1080p" frame. If the fields are not the same temporally, as is usually the case, the image is still not "540p". That's like saying an interlaced DVD is "240p".

"But since these TVs come in 16:9, 1080i actually has more pixels (1920 * 540 (1080i) > 1280 * 720) which means better quality."

You're forgetting that his TV acts as the bottleneck here. The maximum horizontal resolution he's got is 1366. So 720p is 921,600 pixels. For 1080i, if we assume the signal stays interlaced and the set has to throw out half of each 1080i frame, then we get 1366 x 540, which is only 737,640.

"And when converting ordinary movies to full HD, you get a 1080p quality movie but with half less fps (~30), which is like the same as watching television."

No idea what you're talking about here. Upconverting a DVD to 1080i doesn't reduce the frame rate. A film is typically 24 frames per second. When the signal is converted to 1080i, a pulldown algorithm is applied, which actually increases the frame rate to create a 30 fps signal. Alternatively, the film can be sped up by about 4% to create a 25 fps "progressive" signal, but this is not the North American norm. 720p, however, under the ATSC standard, can maintain the proper frame rate.

"The only difference is that the real 1080p has twice the amount of frames"

Actually, under the ATSC standards, 1080i and 1080p have the same maximum frame rate: 30. However, 1080i at 30 fps can be composed of 60 fields per second, and this is where the "added fluidity of motion" argument comes from.
However, 720p is fully capable of a progressive 60 frames per second under the standards. It is also worth noting that current hardware can in most cases display 1080p at 60 progressive frames per second, or even higher, despite the fact that this is not part of the ATSC standards.
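For anyone who wants to check the arithmetic in the post above, here's a quick Python sketch. The 1366 x 540 figure assumes the worst case described there (the set discards one field of the interlaced frame, then scales to the panel's width), and the pulldown function is only an illustration of the classic 3:2 cadence, not any particular device's implementation:

```python
# Signal pixels per frame, as in the post above.
p720 = 1280 * 720               # a full progressive 720p frame
print(p720)                     # 921600

# If the set throws out one field of a 1080i frame and scales
# the surviving 540 lines to the panel's 1366-pixel width:
i1080_one_field = 1366 * 540
print(i1080_one_field)          # 737640
print(p720 > i1080_one_field)   # True: 720p carries more

# 3:2 pulldown: 4 film frames (24 fps) become 10 fields, i.e.
# 5 interlaced frames at ~30 fps -- the frame rate goes UP,
# not down, when film is converted to 1080i.
def three_two_pulldown(frames):
    cadence = [3, 2]            # repeat each frame for 3 fields, then 2
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * cadence[i % 2])
    return fields

fields = three_two_pulldown(list(range(4)))  # film frames 0..3
print(len(fields))              # 10 fields -> 5 interlaced frames
```

Running it confirms both points: 921,600 > 737,640, and pulldown turns 24 film frames per second into 60 fields (30 interlaced frames) per second.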
Fozzy Fozborne Posted March 17, 2010

I've got a 720p TV, and to me it looks way better at 720p than 1080i. Why not try it out yourself and see the difference?

@Otter, and you wonder why people get confused...
Otter Posted March 18, 2010

"I've got a 720p TV, and to me it looks way better at 720p than 1080i. Why not try it out yourself and see the difference? @Otter, and you wonder why people get confused..."

f*ck, I tear my dick off pretty much every time I have to output for a new client. Last project I did was actually something like 14800x32. Insanity.
okei Posted March 19, 2010

Erm, I may be wrong here, but don't 720p televisions support 1080i, or have a 1080i mode at least? Because I know mine does.
Otter Posted March 19, 2010

Yeah, the question here is what looks better in the end on a display like that: a 720p signal or a 1080i signal. From my experience, and for the reasons posted above, I say 720p.

An exception, however, may be when watching SD digital cable channels. Personally, my cable box f*cking sucks at deinterlacing those signals when I'm using 720p mode. They can look far better if the TV deinterlaces them instead. This is all relative to hardware, however.
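For the curious: "deinterlacing" here just means rebuilding full frames from the alternating half-height fields. A minimal sketch of the crudest method, a "bob" that line-doubles one field, is below. This is deliberately naive; real cable boxes and TVs use motion-adaptive filters that blend information from both fields, which is exactly why deinterlacing quality varies so much between devices:

```python
def bob_deinterlace(field):
    """Line-double a single field into a full-height frame.

    `field` is a list of pixel rows from one interlaced field.
    A real deinterlacer would interpolate or motion-adapt between
    fields instead of just repeating each line.
    """
    frame = []
    for row in field:
        frame.append(row)
        frame.append(row)   # repeat each field line to fill the gap
    return frame

# A 240-line SD field becomes a 480-line frame:
field = [[y] * 8 for y in range(240)]
frame = bob_deinterlace(field)
print(len(frame))  # 480
```

The repeated lines are what make cheap deinterlacing look soft or jaggy, which matches the observation above that offloading the job to a better deinterlacer (box vs. TV) can visibly change the picture.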