# New 1080p tv - output 720p or 1080i?



## zooie123 (Feb 22, 2006)

Sorry to bother you, but my new TV is coming today and I have a question. It is a Samsung HLR6168, 1080p. Should I make my DirecTV HD DVR output 720p or 1080i? With my old 720p TV it was easy: output in 720p. Now, with the new TV, what should I do? I know it will upscale everything to 1080p, and logic tells me 1080i will be better, but I just don't know. Thank you for any help you can give.

Zooie


----------



## kimsan (Jan 23, 2002)

zooie123 said:


> Sorry to bother you, but my new TV is coming today and I have a question. It is a Samsung HLR6168, 1080p. Should I make my DirecTV HD DVR output 720p or 1080i? With my old 720p TV it was easy: output in 720p. Now, with the new TV, what should I do? I know it will upscale everything to 1080p, and logic tells me 1080i will be better, but I just don't know. Thank you for any help you can give.
> 
> Zooie


Only Fox and ABC (of the HD channels D* carries) are 720p. The rest are 1080i (HD-Lite with the horizontal squish). The up-converter in the HR10 is pretty durn good. Let it do the resizing to 1080i and the TV deinterlace to 1080p. Feed the TV all the lines you can get!

That's what my head says. Try both and use your eyes to make the decision.

The *best* solution would likely be native pass-through, but we just don't have that option.


----------



## stevel (Aug 23, 2000)

You want 1080i. If the TV does its job right, it will deinterlace the signal perfectly to produce 1080p for the panel.

There's an interesting article in the March Home Theater Magazine with tests of a bunch of TVs to see which do deinterlacing right and which "cheat" (by taking one 540-line field and doubling it). The Samsung HL-R6167 (a 720p model) failed the test; the 6168 is not listed. However, the 5668, which I assume is a smaller version of the 6168, is listed as passing the test.

I was glad to see that my Sony KDS-R60XBR1 also passed.


----------



## AbMagFab (Feb 5, 2001)

Absolutely 1080i. Otherwise you've mostly wasted your 1080p set. You want all the pixels to make it to your TV.

(And BTW, even for a 720p set you might want 1080i, as the more detail you send to the set, the better it looks, and the set does a better job at downrezzing than the HD TiVo - IMO.)


----------



## bpratt (Nov 20, 2004)

> Absolutely 1080i. Otherwise you've mostly wasted your 1080p set. You want all the pixels to make it to your TV.


That logic would work fine if the stations would broadcast the defined number of 1080i pixels in the HDTV standard. As can be seen in the chart from the following web site, 1080i as practiced is fewer pixels/second than 720p. Also, the amount of compression being done has an effect on the picture quality. 
http://www.alvyray.com/DigitalTV/DTV_Bandwidths.htm
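For the curious, the pixel-rate comparison behind that chart reduces to quick arithmetic. The sketch below treats 1080i "as practiced" as the 1440-wide "HD-Lite" squeeze mentioned earlier in the thread; these are nominal figures, not measurements of any actual channel:

```python
# Pixels delivered per second for the formats under discussion.
# 1080i sends 540-line fields 60 times a second; 720p sends full
# 720-line frames 60 times a second. Numbers are illustrative.

def pixels_per_second(width, active_lines_per_scan, scans_per_second):
    return width * active_lines_per_scan * scans_per_second

formats = {
    "720p60 (full frames)":      pixels_per_second(1280, 720, 60),
    "1080i60 (540-line fields)": pixels_per_second(1920, 540, 60),
    "HD-Lite 1440x1080i":        pixels_per_second(1440, 540, 60),
}

for name, pps in formats.items():
    print(f"{name:28s} {pps / 1e6:6.1f} Mpixels/s")
```

Full-width 1080i actually delivers more pixels per second than 720p (62.2M vs. 55.3M); only the squeezed 1440-wide variant delivers fewer (46.7M), which is what the "as practiced" claim hinges on.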

The best way to determine what to set your HR10-250 output at is to try it on your TV and see which looks best to you.


----------



## vertigo235 (Oct 27, 2000)

My TV is native 1080i. I tried sending 720p to the TV for a while for those stations that were native 720p and could see no difference, so now I leave it at 1080i. I'd say there is no reason to use 720p if you have a 1080i or 1080p set.


----------



## AbMagFab (Feb 5, 2001)

bpratt said:


> That logic would work fine if the stations would broadcast the defined number of 1080i pixels in the HDTV standard. As can be seen in the chart from the following web site, 1080i as practiced is fewer pixels/second than 720p. Also, the amount of compression being done has an effect on the picture quality.
> http://www.alvyray.com/DigitalTV/DTV_Bandwidths.htm
> 
> The best way to determine what to set your HR10-250 output at is to try it on your TV and see which looks best to you.


I don't get why so many people get confused about this.

Resolution = # of pixels in an image, or the dimensions of the image. 1080i is 1920x1080. The only way to see each pixel in a 1080i image is to set your HR10 to 1080i.

Bitrate, or bits per second, has to do with compression and ultimately compression artifacts. In simple terms, it has to do with how many pixels change per unit of time. While bitrate affects the *quality* of the picture, it does not reduce the *resolution* of the picture.

Low bitrates don't always mean bad picture quality, either. If you have a slowly changing image, you don't need a high bitrate to have a high quality picture. But rapidly changing images need a higher bitrate to keep up.
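The distinction can be put in numbers: at a fixed 1080i frame size, changing the bitrate only changes how many bits the encoder gets per pixel, never the pixel count. A quick sketch (the bitrates below are illustrative, not measured from any channel):

```python
# Bitrate vs. resolution: frame dimensions are fixed by the format,
# while bitrate only changes how hard the encoder must compress.

def bits_per_pixel(bitrate_bps, width, height, frames_per_sec):
    return bitrate_bps / (width * height * frames_per_sec)

# 1080i carries 1920x1080 pixels 30 times a second regardless of bitrate.
for mbps in (8, 12, 19.4):
    bpp = bits_per_pixel(mbps * 1e6, 1920, 1080, 30)
    print(f"{mbps:5.1f} Mbps -> {bpp:.3f} bits/pixel (resolution unchanged)")
```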

Got it?

In other words, if you want to see every pixel of a 1080i source, you need to use 1080i on your HR10 (or other receiver). As for the quality of the material and the amount of compression, that's independent of the resolution you are viewing, and won't impact the decision for the setting you use.


----------



## TyroneShoes (Sep 6, 2004)

bpratt said:


> That logic would work fine if the stations would broadcast the defined number of 1080i pixels in the HDTV standard. As can be seen in the chart from the following web site, 1080i as practiced is fewer pixels/second than 720p. Also, the amount of compression being done has an effect on the picture quality.
> http://www.alvyray.com/DigitalTV/DTV_Bandwidths.htm
> 
> The best way to determine what to set your HR10-250 output at is to try it on your TV and see which looks best to you.


I agree with the last statement wholeheartedly. I have a 768-native Sony, and while that means that 1080i content output at 1080 would still have a potential for more rez than 1080i content output at 720, 720 actually looks best. This may be because the deinterlacer on the HR10 does a better job, I don't know (most 2005 Sonys do not deinterlace all that well, even though they normally have better PQ than a lot of sets that do deinterlace well, even for 1080i content). Whatever is at work here, diagonal lines have fewer jaggies, and only on the rarest of occasions does the actual achieved rez in 1080i reach a level that makes a difference between setting the HR10 for 1080 or 720. I'll opt for the tiny potential loss of rez over the constant bad deinterlace anytime, so I stay at 720 virtually all of the time.

What I have a problem with is the vague description in the link regarding 1080i "as practiced". I don't for the life of me understand what this refers to. All I can tell you is that virtually any TV station that broadcasts 1080i content is sending 1080 lines of 1920 pixels each, which by pretty much anyone's definition is broadcasting in 1080i as defined. To state that they are altering that somehow to 1440x1035 is just a ludicrous assertion. There is no motivation of any kind for them to do that, because the equipment they use can easily do 1080i as defined, and reinterpolating to something less is an extra step with no benefit whatsoever that would actually mean extra cost, extra equipment, and extra maintenance. Of course this site assumes everyone always uses 18 Mbps rates, too, so its reliability is already in question.


----------



## TyroneShoes (Sep 6, 2004)

AbMagFab said:


> ...
> 
> Got it?
> 
> In other words, if you want to see every pixel of a 1080i source, you need to use 1080i on your HR10 (or other receiver). As for the quality of the material and the amount of compression, that's independent of the resolution you are viewing, and won't impact the decision for the setting you use.


First of all, you don't ever "see every pixel of a 1080i source"; in fact, you see none of the pixels of the source. You only see the pixels that your set displays, so you are always (as long as you are not blinking) seeing all of the pixels you will ever see, regardless of the source: the same 1920x1080 pixels on a 1080p set, or 1280x720 pixels on a 720p set.

The advantage of viewing 1080i content output at 1080 on a 1080 set is that you can maintain potential resolution and avoid rescaling (which if you'll forgive me for guessing, is probably more accurately what you were trying to say). That could be a significant advantage, or maybe it won't be, and maybe any advantage might be offset by other, even worse factors that make a 720 setting attractive on its own merits.

If the content actually achieves 1920x1080 resolution (which is rare in practice), AND if you have 20/20 vision, AND if you are approximately 3.3 picture heights from the screen, you could actually enjoy a part of that higher resolution advantage, but due to Kell factor you can only perceive about 756 vertical lines of resolution anyway, reducing the effective perceived rez quite a bit. On the other hand, if you are 12 ft away from a 60" 1080p set, you might have overpaid for the resolution you are actually perceiving, which will be somewhat less than even that. Once the pixels are smaller (from your further-away POV) than 1/60th of a degree of arc, more pixels buys you exactly nothing, meaning that any effort to maintain that potentially higher resolution is a completely wasted effort.
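The viewing-distance point above can be checked with a little trigonometry: once a pixel subtends less than about one arc-minute (1/60 of a degree), a 20/20 eye can no longer resolve it. A rough sketch using the 60"/12 ft example; the formula assumes a flat 16:9 screen:

```python
# Angular size of one pixel, in arc-minutes, for a 16:9 display.
import math

def pixel_arc_minutes(diagonal_in, distance_ft, horiz_pixels, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pixel_in = width_in / horiz_pixels                        # one pixel's width
    angle_rad = 2 * math.atan(pixel_in / (2 * distance_ft * 12))
    return math.degrees(angle_rad) * 60

# 60" 1080p set viewed from 12 feet, as in the example above
a = pixel_arc_minutes(60, 12, 1920)
print(f"pixel size at 12 ft: {a:.2f} arc-minutes")  # well below the ~1.0' threshold
```

At that distance each 1080p pixel comes out to roughly 0.65 arc-minutes, i.e. already finer than the eye can resolve, which is exactly the "overpaid for the resolution" scenario described above.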

And there is one other important aspect: not only do we want to preserve resolution, we want the pixels we do see to represent the image faithfully with regard to exactly where those pixels appear on the screen, regardless of how resolved they are. If the deinterlacer doesn't do a perfect job, many (perhaps 50% or more) of the pixels you see represent pixels that were each originally in a slightly different X/Y location. And since you have more pixels to work with at 1080p, it becomes even easier to see that they are not in the proper location. Having them in exactly the right location turns out to be surprisingly important, possibly more important than a potential loss of a percent or two of perceived resolution.

And that's exactly why a 720 output setting on your HR10 might still be a better choice, even for a 1080p set. You would have to actually try it first and see, directly on your set, before you can make a sweeping declaration about which setting is "correct", even for owners of that particular set. Forget about whether that means it might be better on everyone else's set.

Got it?

Good.


----------



## Kash76 (Jul 29, 2001)

I have the Sony KDS-R50XBR1, which displays at 1080p but only accepts up to 1080i. I noticed tonight, while my wife was watching American Idol, that the circle in the logo looked jagged on the 1080i setting from my HR10-250. I changed it to 720p and it smoothed out. I don't normally see any difference, but this time I did. I wonder whether, if you are really picky, you are better off running the native resolution of the show. I'm guessing that because the Fox NFL games are at 720p, American Idol is also?


----------



## jschmidt (Mar 4, 2003)

1080i at 60 fps is really only 540 lines of resolution per field (interlaced). 720p at 60 fps is 720 lines of resolution per frame. It sounds counterintuitive, but 720p is actually a higher resolution than 1080i if you take into account the temporal resolution (lines per frame and number of frames per second). In many cases 720p looks better than 1080i. 1080p at 60 fps would be the best, but nothing is available in that format yet. Your TV may upconvert to 1080p, but it's not getting any more information. You will probably need to experiment for a while to see what looks best on your set. It will be several years before any 1080p content (broadcast or pre-recorded) is available, because of bandwidth limitations.


----------



## bdlucas (Feb 15, 2004)

TyroneShoes said:


> 1440x1035


A little googling turns up those numbers in connection with some content-origination equipment, such as some Sony HD cameras. It appears this is a legacy resolution associated with the MUSE system. This is the internal resolution of the camera, e.g. the CCD sensor, and the camera upconverts it to 1920x1080 before it leaves the camera (as far as I can tell). I didn't do enough research to determine how widespread this is, e.g. whether it's only true of older HD cameras.

In other words, the "as practiced" seems to refer to the resolution (in the real original sense of the term resolution) of the content of some sources of HD material, not to the pixel format of the image at any point in the chain past the point where the signal leaves the camera.


----------



## herdfan (Feb 5, 2003)

I feed my Toshiba 1080p a 1080i signal. Even FOX and ABC 720 content looks stunning on it at 1080i. At 720 it still looks good, but not quite as good as 1080i. Since both the Toshiba and Samsung use the same DLP chip, your results should be the same, but YMMV.


----------



## AbMagFab (Feb 5, 2001)

jschmidt said:


> 1080i at 60 fps is really only 540 lines of resolution per field (interlaced). 720p at 60 fps is 720 lines of resolution per frame. It sounds counterintuitive, but 720p is actually a higher resolution than 1080i if you take into account the temporal resolution (lines per frame and number of frames per second). In many cases 720p looks better than 1080i. 1080p at 60 fps would be the best, but nothing is available in that format yet. Your TV may upconvert to 1080p, but it's not getting any more information. You will probably need to experiment for a while to see what looks best on your set. It will be several years before any 1080p content (broadcast or pre-recorded) is available, because of bandwidth limitations.


Totally wrong. 1080i is 1080 lines. That 540p thing is old-school. I'm sure TS will jump in and give a highly technical explanation, but the upshot is that on 1080p sets today, they are capable of resolving 1080 lines from a 1080i source.


----------



## jhimmel (Dec 27, 2002)

Zooie, Aren't you glad you asked???


----------



## aaronwt (Jan 31, 2002)

I have the same 6168 set. I found the picture looks best if I leave it at 1080i. I am going through a DVDO VP30 scaler, though, before I go to the TV. I can't wait until my HD DVD player arrives next month; it should produce an excellent picture. The WMV-HD films at 1080p also look excellent on the 6168 using the VGA port.


----------



## TyroneShoes (Sep 6, 2004)

bdlucas said:


> A little googling turns up those numbers in connection with some content-origination equipment, such as some Sony HD cameras. It appears this is a legacy resolution associated with the MUSE system. This is the internal resolution of the camera, e.g. the CCD sensor, and the camera upconverts it to 1920x1080 before it leaves the camera (as far as I can tell). I didn't do enough research to determine how widespread this is, e.g. whether it's only true of older HD cameras.
> 
> In other words, the "as practiced" seems to refer to the resolution (in the real original sense of the term resolution) of the content of some sources of HD material, not to the pixel format of the image at any point in the chain past the point where the signal leaves the camera.


Ahhhh! Thank you. That makes sense. IOW, they are claiming that some HD providers are using cams during acquisition with less than 1920x1080 rez, and broadcasting that as HD. That may be true, but I'm betting that those cams are old and fast disappearing.


----------



## TyroneShoes (Sep 6, 2004)

AbMagFab said:


> Totally wrong. 1080i is 1080 lines. That 540p thing is old-school. I'm sure TS will jump in and give a highly technical explanation, but the upshot is that on 1080p sets today, they are capable of resolving 1080 lines from a 1080i source.


Never fear, I pretty much agree. I just never paint myself into a corner with absolute statements such as "totally wrong." Why? Because in this case, as in most, there is no absolute way to define this. Many modern TVs still deinterlace in a manner that simply repeats each of 540 of the 1080 lines, including most if not all Sonys released prior to the A10. That means 540p, while old-school, is still alive and well in many of our living rooms. And those sets still appear to have comparable rez, because the effective rez of the content still rarely approaches the potential.

And you are right. 1080i content is quite often captured as 1080p, which means that using 1080i to process and transmit it allows a 1080p set to reconstruct it as true 1080p content. Unfortunately, many of the 1080p sets will not allow 1080p content natively from the upcoming 1080p DVD formats. We'll have to wait and see if that is a problem or not.

Having a 1080p set for the advantage of resolution might not really mean much for a long time, and I have been harping on that for a while now, but they do appear to have (at least to me) an unexpected benefit, which is that SD content seems to look better on them than it does on sets with less native rez. Comparing a 768-rez Sony to a 1080p-rez Sony seems to reveal that while HD is slightly better on the 1080p set (primarily because of the LCOS engine and better black response), SD stuff looks significantly better. The reason appears to be pixel size: even at the prescribed viewing distance (where you shouldn't be able to discern pixels, even at 768 rez), SD upconvert artifacts such as increased mosquito noise seem to intermodulate less with the finer pixel structure of a 1080p set. The end result is fewer visible artifacts and less potential silk-screen effect. The opposite is true of 720p sets like the A10, where the larger (than 768-rez) pixels seem to exaggerate SD upconvert artifacts.

Forget the rez advantage, better SD in a primarily SD world makes 1080p the only way to go. I bought too soon.


----------



## TyroneShoes (Sep 6, 2004)

jschmidt said:


> ...It sounds counterintuitive, but 720p is actually a higher resolution than 1080i if you take into account the temporal resolution (lines per frame and number of frames per second). In many cases 720p looks better than 1080i...


If the deinterlace is not done correctly, static images in 1080i have less potential V resolution than static images in 720p. If the deinterlace _IS_ done properly, static images in 1080i have _more_ potential V resolution than 720p. Moving images in 1080i have less potential V rez than moving images in 720p, regardless of the deinterlace used (about 500 lines under the best of circumstances). But 1080i (non-DTV, anyway) has more potential H rez than 720p, regardless of the deinterlace, and regardless of whether the images are static or dynamic. 1080i from 1080p source content, deinterlaced properly, can preserve about 750 lines of effective V rez, which is the best available, and of course it also preserves all of the H rez. That is one case where 720p will not look better.


----------



## TyroneShoes (Sep 6, 2004)

Kash76 said:


> ...I noticed tonight, while my wife was watching American Idol, that the circle in the logo looked jagged on the 1080i setting from my HR10-250. I changed it to 720p and it smoothed out. I don't normally see any difference, but this time I did. I wonder whether, if you are really picky, you are better off running the native resolution of the show. I'm guessing that because the Fox NFL games are at 720p, American Idol is also?


Yeah, right. My _wife_ was watching Idol. Uh huh. 

This is a good example of a situation where there is no extra conversion from one format to another. Not only is all content from FOX broadcast at 720p, Idol uses 720p cameras. (You might also notice that there is no transient pixellation on scene changes on Idol, shot at 720p, and there is on "Bones", which is telecined from film.) If you set the HR10 to 720, then you limit the reinterpolation to when your Sony upinterpolates it to the native rez of your display. The upinterpolation to 1080 in the HR10 leaves a lot to be desired (as opposed to that in your Sony, most likely). I use 720 on my 768 for all content, because otherwise it seems to have more jaggies on diagonal lines. Trading the potential loss of occasional rez for more accurate anti-aliasing of edges 100% of the time seems like a no-brainer.

The answer to your question is probably for you to do a similar experiment with 1080 content output at 1080. From what I have seen, using 720 still appears better, but then I have a 768p set. It might be different on a 1080p set.


----------



## Kash76 (Jul 29, 2001)

TyroneShoes said:


> Yeah, right. My _wife_ was watching Idol. Uh huh.
> 
> This is a good example of a situation where there is no extra conversion from one format to another. Not only is all content from FOX broadcast at 720p, Idol uses 720p cameras. (You might also notice that there is no transient pixellation on scene changes on Idol, shot at 720p, and there is on "Bones", which is telecined from film.) If you set the HR10 to 720, then you limit the reinterpolation to when your Sony upinterpolates it to the native rez of your display. The upinterpolation to 1080 in the HR10 leaves a lot to be desired (as opposed to that in your Sony, most likely). I use 720 on my 768 for all content, because otherwise it seems to have more jaggies on diagonal lines. Trading the potential loss of occasional rez for more accurate anti-aliasing of edges 100% of the time seems like a no-brainer.
> 
> The answer to your question is probably for you to do a similar experiment with 1080 content output at 1080. From what I have seen, using 720 still appears better, but then I have a 768p set. It might be different on a 1080p set.


Well, maybe I was sort of watching also.

Thanks for the info, that does make sense!


----------



## ssandhoops (Feb 23, 2002)

AbMagFab said:


> Totally wrong. 1080i is 1080 lines.


Yes, 1080i is 1080 lines but the key here is the "i". Yes, you get 1080 lines of information but you only get half of those lines (540) on each scan. Compare that to 720p where you get 720 lines of information on each scan. So which is better, breaking up the display into 1080 lines and refresh half of those lines on each scan or breaking up the display into 720 lines and refreshing every line on each scan? I'm not making a case for either one because my eyes can't tell the difference, just wanted to point this out.


----------



## stevel (Aug 23, 2000)

The better TVs buffer and combine the two "scans" to form a 1080p frame for display. The scans are at 60Hz, combining to form 1080p at 30Hz. With 720p, you get 720p at 30Hz. Which is better? You tell me...
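The buffering stevel describes is essentially a "weave" deinterlace: two consecutive 540-line fields are interleaved into one 1080-line frame. A toy sketch in Python (real deinterlacers also have to detect motion between the fields, which this deliberately ignores):

```python
def weave(top_field, bottom_field):
    """Interleave a top field (even lines) and a bottom field (odd lines)."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # even output line, from the top field
        frame.append(bottom_line)  # odd output line, from the bottom field
    return frame

# Two toy 540-line, 1920-pixel fields captured 1/60 s apart
top = [[0] * 1920 for _ in range(540)]
bottom = [[1] * 1920 for _ in range(540)]

frame = weave(top, bottom)
print(len(frame), len(frame[0]))  # 1080 1920: two 60 Hz fields -> one 30 Hz frame
```

The weave is only clean when both fields came from the same instant (film or 1080p-originated material); when the fields were captured 1/60 s apart and something moved, the interleaved lines disagree, which is where motion artifacts come from.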


----------



## Kash76 (Jul 29, 2001)

That might explain my audio/video sync being off quite a bit!?


----------



## JTAnderson (Jun 6, 2000)

Why do people keep asking this question instead of just trying it?


----------



## stevel (Aug 23, 2000)

Kash76 said:


> That might explain my audio/video sync being off quite a bit!?


Indirectly, perhaps. There is quite a bit of processing that goes on for the video signal, and this can introduce a delay, typically 30-40 ms or so. This is why some receivers, TVs, and outboard boxes have audio delay circuitry, and why HDMI 1.3 will specify its inclusion in compatible products.
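To put that 30-40 ms figure in perspective, here is what it means in video frames and in audio samples, which is what a lip-sync delay circuit has to buffer (the figures are nominal):

```python
# Video-processing delay expressed in frames and in audio samples.

def delay_in_frames(delay_ms, fps):
    return delay_ms / 1000 * fps

def delay_in_samples(delay_ms, sample_rate=48000):
    return round(delay_ms / 1000 * sample_rate)

for ms in (30, 40):
    print(f"{ms} ms = {delay_in_frames(ms, 30):.1f} frames at 30 fps, "
          f"{delay_in_samples(ms)} samples at 48 kHz")
```

A 30-40 ms video delay is roughly one frame of 30 fps video, so the audio side must hold back one to two thousand samples to stay in sync.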


----------



## rcbray (Mar 31, 2004)

With my SXRD and HD TiVo, native pass-through should be best.

For 720p programming, sent native it goes to the SXRD and is scaled to 1080p (720 lines of original resolution, and only one process). If I change the TiVo to 1080i, it is scaled and then interlaced, goes to the SXRD, and is then deinterlaced (the original 720 lines of resolution after three processes).

For 1080i programming, native goes from the TiVo to the SXRD and is then deinterlaced (1080 lines of original resolution after only one process). If I change the TiVo to 720p, it is deinterlaced and then scaled in the TiVo and then scaled again by the SXRD, so it has only 720 lines of resolution and has gone through three processes.

Therefore, assuming the scaling and deinterlacing/interlacing functions are roughly comparable, it is best to send native to the SXRD because it minimizes processing. In fact the SXRD does a better job, which is all the more reason to send native from the TiVo.
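The step-counting above can be written out explicitly. The stage names below are my own shorthand for what each box would have to do, not TiVo or Sony terminology:

```python
# Conversion chains for a 1080p SXRD fed by an HD TiVo, per the
# reasoning above: native output minimizes the number of conversions.
chains = {
    "720p show, TiVo set to 720p": [
        "TV scales 720p -> 1080p",
    ],
    "720p show, TiVo set to 1080i": [
        "TiVo scales 720p -> 1080",
        "TiVo interlaces -> 1080i",
        "TV deinterlaces 1080i -> 1080p",
    ],
    "1080i show, TiVo set to 1080i": [
        "TV deinterlaces 1080i -> 1080p",
    ],
    "1080i show, TiVo set to 720p": [
        "TiVo deinterlaces 1080i",
        "TiVo scales -> 720p",
        "TV scales 720p -> 1080p",
    ],
}

for setup, steps in chains.items():
    print(f"{setup}: {len(steps)} conversion step(s)")
```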


----------



## aaronwt (Jan 31, 2002)

But with the HD TiVo it's just easier to set it at one resolution, since it doesn't have a native pass-through mode. That is why I leave my HD TiVos set at 1080i. Sometimes I will switch it to 720p, but I usually don't. The picture will be excellent either way with a calibrated 6168 Samsung. A professional calibration needs to be performed on this set to really bring out the best in it.


----------



## TyroneShoes (Sep 6, 2004)

You're right, rc. And according to that logic what you say would be the best choice for your set, at least. Now why don't you try it and report back to us on how much difference it makes, or doesn't make?


----------



## TyroneShoes (Sep 6, 2004)

Kash76 said:


> That might explain my audio/video sync being off quite a bit!?


Not in the least. Different issue, no interaction.


----------



## TyroneShoes (Sep 6, 2004)

stevel said:


> The better TVs buffer and combine the two "scans" to form a 1080p frame for display. The scans are at 60Hz, combining to form 1080p at 30Hz. With 720p, you get 720p at 30Hz. Which is better? You tell me...


OK, but the answer is far from simple. For true 1080p acquisition, a 1080p set would be better. For 1080 acquisition that does not preserve the time relationships between the separate fields, a 1080p set would also be better. 1080p sets are typically better than any other choice, all else being equal, for all content. The rare exception might be 720p content, which implies at least a rescaling on a 1080p set but would not on a 720-native set. But rescaling is not very destructive, so its effect is probably negligible, even there. 1080p sets are new enough that probably all of them do deinterlace properly, too.

But not everyone can afford a 1080p set, not everyone bought a 1080p set (many buying before they were available), and not everyone even wants a 1080p set.

That about covers resolution and rescaling, but not motion artifacting. And it's actually 720p/60: 60 unique fields/frames a second, each frame providing 720 lines (this also explains why the pixel delivery rate is about 8/9ths of that for 1080i, and why 720p signals need slightly less bandwidth than 1080i for the same level of quality, another reason being that compression of interlaced signals is not as efficient). 1080i provides 60 fields, each composing half of a frame (every other line), for only 30 frames a second. At that point the scan rate of a 1080-native set is half that of a 720p set. Is that better? You tell me...

A 1080p set can double the scan rate to 60 if it is designed that way, which means it repeats each reinterlaced field twice. That will reduce flicker to the same level as a native 720p set, but might introduce slight motion judder. IOW, a moving object will be represented in a unique X/Y position 60 times a second for 720p, but only 30 times a second even at a 60 Hz scan rate for 1080p. Is that better? Maybe not, but for either delivery method, the 1080p set will still be better in most aspects. There is no trade-off or downside to a 1080p set compared to any other set, other than the price.

The gurus at ABC have gone on record very convincingly as to why they believe 720p delivery is superior, and it's not just spin. If they thought 1080i was better, nothing was stopping them, or FOX or ESPN, from choosing that delivery method. But they chose 720p instead, primarily because of less motion artifacting.

But true 1080p display technology shoots a few holes in their theory. 1080i delivery of 1080i content to a set that doesn't deinterlace properly is significantly inferior to 720p delivery of 720p content when there is high motion, but 1080i delivery of 1080p content to a 1080p set that does proper deinterlace is only slightly inferior to 720p delivery of 720p content when there is high motion, unless the bandwidth is restricted. When there is no high motion, true 1080i rules due to its higher resolution, on sets with a greater-than-720 native rez that deinterlace properly, anyway. And if a 1080p set simply reinterpolates 720p to 1080p, it will have the same freedom from motion artifacting (not to be confused with compression artifacting) for 720p content that any 768 or 720 set enjoys.

So, a 1080p set is better in almost every way, generally speaking. But it is not superior in every way, and it is not significantly better in some of the ways it actually is superior. But it is significantly superior in some critical ways. After seeing how a SXRD handles all content, especially SD content, I would never today buy anything less than that level of technology. I just wish that was available in October, 2004, although even today it is significantly more costly than the state of art was then.


----------



## jschmidt (Mar 4, 2003)

TyroneShoes said:


> If the deinterlace is not done correctly, static images in 1080i have less potential V resolution than static images in 720p. If the deinterlace _IS_ done properly, static images in 1080i have _more_ potential V resolution than 720p. Moving images in 1080i have less potential V rez than moving images in 720p, regardless of the deinterlace used (about 500 lines under the best of circumstances). But 1080i (non-DTV, anyway) has more potential H rez than 720p, regardless of the deinterlace, and regardless of whether the images are static or dynamic. 1080i from 1080p source content, deinterlaced properly, can preserve about 750 lines of effective V rez, which is the best available, and of course it also preserves all of the H rez. That is one case where 720p will not look better.


I have read some of the disagreements with my statements. Allow me to clarify. De-interlacing works, no doubt about it: if you have a 1080p set that de-interlaces well, you can achieve 1080 lines of resolution, but at only 30 frames per second. 720p is 60 frames per second; it is 720 lines of resolution every 1/60 of a second. 1080i (even properly de-interlaced to 1080p) is still only 540 new lines of resolution every 1/60 of a second. Some people prefer the picture of 720p for this reason: more frames per second means a better ability to capture fast-moving action, which is better for sports and video games. Some people prefer 1080i for film content. My point is that you have to take into account the temporal resolution (number of frames per second) as well as the spatial resolution (number of pixels). Right now, it is not possible for existing interconnects (even on the pro market) to handle 1080p uncompressed. It requires something like 10Gb per second bandwidth, which we don't have yet. I don't think we'll see true, native 1080p content being fed to our HDTVs for quite some time.


----------



## AbMagFab (Feb 5, 2001)

TyroneShoes said:


> OK, but the answer is far from simple. For true 1080p acquisition, a 1080p set would be better. For 1080 acquisition that does not preserve the time relationships between the separate fields, a 1080p set would also be better. 1080p sets are typically better than any other choice, all else being equal, for all content. The rare exception might be 720p content, which implies at least a rescaling on a 1080p set but would not on a 720-native set. But rescaling is not very destructive, so its effect is probably negligible, even there. 1080p sets are new enough that probably all of them do deinterlace properly, too.
> 
> But not everyone can afford a 1080p set, not everyone bought a 1080p set (many buying before they were available), and not everyone even wants a 1080p set.
> 
> ...


Nice explanation. And yes, the SXRD is a nice set.


----------



## zooie123 (Feb 22, 2006)

Hey all,

Thanks for so many informative responses. I have hooked all up and played for about 3 hrs with all equipment.

Xbox 360 - component 2, 1080i - works great, no lag in high-action games, much better than on the 720p set.

Samsung HT-in-a-box - component 1, 480p - looks fantastic, much better than on the 720p set.

HR10-250 - HDMI, 1080i - regular SD channels look better than on the 720p set; HD content on channels 70-79 looks a tad better so far; have not tried OTA yet.

All in all, I will play between 1080i and 720p output from the DVR, but so far 1080i looks great. I am so happy with my purchase.

Thanks again all.


----------



## TyroneShoes (Sep 6, 2004)

jschmidt said:


> ...Right now, it is not possible for existing interconnects (even on the pro market) to handle 1080p uncompressed. It requires something like 10Gb per second bandwidth, which we don't have yet. I don't think we'll see true, native 1080p content being fed to our HDTVs for quite some time.


 I agree, and I think it's for a couple of reasons. The biggest reason is that 1080p sets can already reconstruct true 1080p content from 1080i broadcasts, so there is no advantage to ever broadcasting 1080p content as 1080p.

Another reason is because the broadcast industry moves at a glacial pace. We've had HD as a glimmer in our eyes since the early 80's, finally started to actively pursue it in the mid 90's, and here we are finally just beginning to implement it in a semi-serious way a decade after that. There has to be a motivating factor, and eating more precious bandwidth for no increase in content is just the opposite of that. Unfortunately, the factors that motivate the providers of HD are factors that are more likely to chip away at available bandwidth.

1080p as acquired takes only about 1.5 Gbps (only!). But that's still a lot of data, especially when the pipe is not very large, and it is impractical to widen the pipe very much at all. There is a finite amount of data that can be sent through a 6 MHz channel, and that is about 20 Mbps. A closed system can transmit about twice that much, because it doesn't need the overhead and redundancy requirements of a terrestrial OTA channel. But the providers are motivated to transmit as few bits as they can get away with, regardless of what is technically possible. Uncompressed HD just doesn't exist for consumers, or outside of acquisition.
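Those figures imply an enormous compression ratio. As a quick sanity check (19.39 Mbps is the standard ATSC payload figure behind the "about 20 Mbps" above):

```python
# Rough compression-ratio estimate from the figures in the post above:
# ~1.5 Gbps at acquisition squeezed into one 6 MHz ATSC channel.

acquisition_bps = 1.5e9      # ~1.5 Gbps, per the post
atsc_payload_bps = 19.39e6   # MPEG-2 transport payload of a 6 MHz channel

ratio = acquisition_bps / atsc_payload_bps
print(f"Required compression: roughly {ratio:.0f}:1")
```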

I actually don't ever expect to see 1080p content delivered as 1080p, either OTA or from other vendors. What we will see is more efficient ways of sending less information to represent more information; MPEG-4 is a perfect example of that. HD will get incrementally better, and there will be less "bad" HD as we go along, but don't expect bandwidth to ever stop being precious, or major PQ improvements. Displays will improve a bit more than the content itself will, and they will become more reliable and less costly, but the change from SD to HD as we know it today is the most dramatic improvement we can expect to see in our lifetimes.


----------



## guyricardo (Feb 25, 2003)

TyroneShoes said:


> I agree, and I think it's for a couple of reasons. The biggest reason is that 1080p sets can already reconstruct true 1080p content from 1080i broadcasts, so there is no advantage to ever broadcasting 1080p content as 1080p.


I'm just trying to understand this point. If I've been following all this correctly, isn't that reconstructed 1080p signal actually a 'doubled' 1080i? Meaning that they take the two halves of the 1080i signal and put them back together (deinterlacing?), then display each combined frame for two 1/60 s intervals. If that's the case, I can see why it would be superior for slow-moving or static images, but for faster motion it's still only producing 30 unique 1080p images per second, as opposed to 60 720p images?

I have no idea which would actually be better. I'm very happy with my 768 set. I keep the box on 720p, and a lot of my programs are action and sports on ABC/Fox/ESPN. I've never really noticed any advantage in having it at 1080i, but maybe it's time to experiment again, especially now that I've added an A/V receiver in between.

Bottom line I take from all this is that if you're buying a new HDTV, 1080p would be better overall, and trust your own eyes as far as setting it up with all your other equipment.

....sorry for the ramble


----------



## rcbray (Mar 31, 2004)

TyroneShoes said:


> You're right, rc. And according to that logic what you say would be the best choice for your set, at least. Now why don't you try it and report back to us on how much difference it makes, or doesn't make?


A native 720p signal fed to the SXRD is marginally better than letting the HD TiVo upconvert it to 1080i. I believe the HD TiVo's replacement will have a native mode, which will be great. However, today the improvement is so slight (and the upconverted 1080i signal looks so good) that I just keep the TiVo on 1080i output unless I know I'm going to watch several hours of 720p; then I may go to the effort of changing the TiVo output to 720p.


----------



## whsbuss (Dec 16, 2002)

rcbray said:


> A native 720p signal fed to the SXRD is marginally better than letting the HD TiVo upconvert it to 1080i. I believe the HD TiVo's replacement will have a native mode, which will be great. However, today the improvement is so slight (and the upconverted 1080i signal looks so good) that I just keep the TiVo on 1080i output unless I know I'm going to watch several hours of 720p; then I may go to the effort of changing the TiVo output to 720p.


Same here on this end. I do watch Fox, ABC, ESPN sports events in 720p. But for normal viewing I just leave the Tivo in 1080i.


----------



## TyroneShoes (Sep 6, 2004)

guyricardo said:


> I'm just trying to understand this point. If I've been following all this correctly, isn't that reconstructed 1080p signal actually a 'doubled' 1080i? Meaning that they take the two halves of the 1080i signal and put them back together (deinterlacing?), then display each combined frame for two 1/60 s intervals. If that's the case, I can see why it would be superior for slow-moving or static images, but for faster motion it's still only producing 30 unique 1080p images per second, as opposed to 60 720p images?...


That is exactly right. For 1080p to be broadcast with any advantage over 1080i (or to have the lower motion artifacts of 720p) it would have to be broadcast at the higher frame rate. The prevailing acquisition format is 1080p/24 for film. Acquisition of video is typically at 1080p/30 (or 720p/60, I believe). This means that only if video were acquired (and post-produced) at a higher frame rate, such as 1080p/60, would it have the lower motion artifacting of 720p.
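The reconstruction being described is essentially a "weave" deinterlace. Here is a toy sketch in Python (illustrative only; real deinterlacers add motion-adaptive logic for moving content, which plain weave omits):

```python
# A minimal "weave" deinterlace: two interlaced fields (the odd and even
# scan lines of the same moment) are interleaved back into one
# progressive frame. Frames here are just lists of scan lines.

def weave(top_field, bottom_field):
    """Interleave two fields into one progressive frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # lines 0, 2, 4, ...
        frame.append(bottom_line)  # lines 1, 3, 5, ...
    return frame

# Toy 4-line example: field A holds the even lines, field B the odd lines.
field_a = ["line0", "line2"]
field_b = ["line1", "line3"]
print(weave(field_a, field_b))  # ['line0', 'line1', 'line2', 'line3']
```

Weaving 60 fields/s yields 30 full frames/s, which is why the frame rate, not the line count, is what 1080i gives up relative to 720p.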

Of course broadcasting twice as many 1080 frames would take significantly more bandwidth, meaning the compression ratio would have to increase (because the bandwidth will be limited to 6 MHz probably forever, even on cable and satellite). So the tradeoff there would not be worth it at least up until the point where compression technology becomes much more sophisticated than anything available today.

We might see the higher frame rate for video acquisition eventually, and there is talk of shooting feature film at a true 48 fps. But unless those things come to pass, which is not likely on any large scale within the lifespan of the brand-new 1080p sets sold this year, broadcast improvements are unlikely to be seen, and DVD telecine transfers at 1080p will also not really be better than 1080i broadcasting, even for 1080p sets.


----------



## TyroneShoes (Sep 6, 2004)

rcbray said:


> A native 720p signal fed to the SXRD is marginally better than letting the HD TiVo upconvert it to 1080i. I believe the HD TiVo's replacement will have a native mode, which will be great...


There is talk of 6.2 coming next month, and that is supposed to allow a native pass-through for the HR10. We'll see. With any luck, this entire discussion will be moot.


----------



## dogdoctor (Feb 20, 2006)

TyroneShoes said:


> There is talk of 6.2 coming next month


Really...where is that information coming from? That would be cool, although I have my doubts.



TyroneShoes said:


> supposed to allow a native pass-through for the HR10.


A bit OT, so forgive me as I am new to a lot of the technical terms. In short, would a native pass-through allow 480i, 480p, 720p, and 1080i signals to pass through the HR10 unchanged, and then let the HDTV upscale or downscale as needed? I assume the benefit of this is that the signal goes through less processing and less degradation? Please educate me. Thanks.
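That's the right idea. As a sketch of the difference between a fixed-output mode and native pass-through (purely illustrative Python; the function name and structure are invented here, not the HR10's actual firmware logic):

```python
# Fixed-output mode rescales everything to one resolution inside the box;
# native mode emits whatever format the broadcast uses and lets the TV
# do any scaling its panel requires.

def output_format(source_format, native_mode, fixed_output="1080i"):
    """Pick the format sent over the HDMI/component output."""
    if native_mode:
        return source_format  # pass the broadcast through untouched
    return fixed_output       # convert everything in the box

for src in ["480i", "480p", "720p", "1080i"]:
    print(src, "->", output_format(src, native_mode=True))   # unchanged
    print(src, "->", output_format(src, native_mode=False))  # always 1080i
```

The benefit is exactly as you describe: one fewer scaling/conversion step in the box, with the TV performing the single conversion its panel actually needs.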


----------



## HomieG (Feb 17, 2003)

Also, I'm pretty sure that the ATSC spec for HD includes 1080p only at 24 and 30 fps, not 60. The bandwidth to broadcast 1080p at the higher frame rate may also be an issue.


----------



## DeDondeEs (Feb 20, 2004)

My TV has 720p native resolution, but if I switch my TiVo to 720p, for some odd reason I get a slight headache when watching, no matter whether the programming is 720p or 1080i. So I just leave it on 1080i. What could cause that? Perhaps the converter in my TV works better than the one in my TiVo. Also, if something is broadcast in 720p and my TV is 720p native, does the TV still do anything to the signal coming across the HDMI, or does it just pass through without conversion?

They should put a setting on the Tivo where it just passes through the resolution of the broadcast to your TV to up/down convert.


----------



## RMSko (Sep 4, 2001)

zooie123 said:


> Hey all,
> 
> Thanks for so many informative responses. I have hooked all up and played for about 3 hrs with all equipment.
> 
> ...


You may want to set the Xbox 360 at 720p. I also have a Sammy 1080p set and although I set my HR10-250 at 1080i, I set my Xbox 360 at 720p. First, many of the games are made in 720p and not 1080i so you would be passing the native resolution to the TV if you use 720p. Also, with games, just as with sports, fast moving action seems to look better at 720p, at least to me. It may be a matter of preference, but others have also reported similar results.


----------

