# Does 720p or 1080i take up more HD space?



## ayrton911 (Sep 4, 2000)

So I've been curious for a while: which takes up more hard drive space, 720p or 1080i? 

I might jump to say 1080i takes up more space, but since it's interlaced, does that make it just about the same amount of data as 720p? How does this affect the hard drive space on our TiVos? Has anyone noticed anything, for example, if you record 720p almost exclusively (ABC, ESPN, FOX)? 

Thanks.


----------



## RunnerFL (May 10, 2005)

From everything I've read and heard about the HR10-250, it doesn't matter which takes up less space, because you can't tell it what resolution to record in. It records whatever comes in, or at least records everything at 1080i so that it can display it all when played back.


----------



## JimSpence (Sep 19, 2001)

He's not really asking about the recording resolution, but whether shows that come in 720p or 1080i take different amounts of space. I say yes, 720p should take less space than 1080i. However, with DirecTV using "HD Lite", the difference probably isn't very big.


----------



## newsposter (Aug 18, 2002)

Very curious question indeed...wonder when they will start 1080p and how much that will take up?  

However, since I won't let it affect what I record, the point is really moot.


----------



## norneslo (May 25, 2004)

I'd seen some other posts on this before also, and like Jim said above, generally 720p should take less. What I'd read was that most if not all 720p currently is at 30fps, so you are getting a 1280x720 image 30 times a second (921,600 pixels), vs. 1920x540 (960,000 pixels) @60fps for 1080i (it's interlaced). Given both are at the same bit rate, you should be getting more data with 1080i @60 (half-)frames per second. If someone broadcast 720p at 60fps, they'd be almost equal again. 

As Jim also mentioned, down rezzing and bit starving by DirecTV further reduces the size of each frame, so essentially when recording DTV HD channels you may get much more than the advertised 25 hours of recording time. 

My guess on 1080p is that you'd get the half frames of 1080i transmitted as 30 complete 1920X1080 frames, so again, only if it was stepped up to 60fps would it take more space.

I read somewhere that the new HD DVD and Blu-ray discs are actually 1080p/24 (film) and have a lower data rate than 1080i. I'm not sure of the full context of that statement; I sort of assumed it meant a 1080p TV could accept a 24fps rate from the DVD player. The author of the article was making the point that some 1080p (at 24fps) is lower overall resolution than deinterlaced 1080i. 

Ron


----------



## newsposter (Aug 18, 2002)

norneslo said:


> As Jim also mentioned, down rezzing and bit starving by DirecTV further reduces the size of each frame, so essentially when recording DTV HD channels you may get much more than the advertised 25 hours of recording time.


And if you do most HD stuff OTA like me, I'm betting you get a lot less.


----------



## ayrton911 (Sep 4, 2000)

Thanks everyone. The programs I record tend to be 720p, so I was trying to see if perhaps I get a bit more recording capacity. ha-ha. Maybe I do then. 

A friend of mine has a DishNetwork HD DVR. I just asked him, and he says when he deletes a 720p program he'll sometimes notice only 45-50 minutes come off the "estimated HD Time remaining," whereas he'll see 70 or 80 minutes come off for an hour-long 1080i program. So maybe, as we've said, there is a difference.


----------



## btwyx (Jan 16, 2003)

The resolution is irrelevant; it's the bit rate sent to you that matters. The bit rate is not tied to the resolution. The bit rate of a full ATSC channel is 19.39 Mb/s (or something close). That can deliver 1080i, 720p, 480p or 480i, or 1080i and 480i, or 1080i and 4x480i, or 720p and 480i, or..., or..., or.... The broadcaster gets to decide what bit rate to send you.

The more subchannels a broadcast has, the less data is in the main channel. That's for over the air. What D* sends you has, in a lot of cases, been recoded and recompressed (what people call "HD Lite"), so it has a lower bit rate; it also doesn't have any subchannels. I've heard quotes of about 12 Mb/s for most HD channels.

Still independent of resolution.

Also, D* has clever dynamic encoders that can change the bit rate sent to you on a moment-by-moment basis. If the encoder decides that one channel needs more bits than another channel on the same transponder, it can rob bits from one channel to supply them to the other, depending on the compressibility of the content. Content with more movement is less compressible. So if your channel is on at the same time, on the same transponder, as a football match, it may get substantially fewer bits than something broadcast at 3am on the same transponder as a shopping channel.
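btwyx's point lends itself to a quick back-of-the-envelope calculation: if only the bit rate matters, recording capacity follows directly from drive size and stream rate. A minimal Python sketch with illustrative numbers (a 250 GB drive, a full ~19.39 Mb/s ATSC channel vs. a ~12 Mb/s "HD Lite" feed; actual overheads are ignored):

```python
def hours_of_recording(drive_gb, bitrate_mbps):
    """Hours a drive can hold at a constant stream bit rate."""
    drive_bits = drive_gb * 1e9 * 8              # decimal GB -> bits
    seconds = drive_bits / (bitrate_mbps * 1e6)  # Mb/s -> b/s
    return seconds / 3600

ota_hours = hours_of_recording(250, 19.39)   # full OTA channel
dbs_hours = hours_of_recording(250, 12.0)    # bit-starved satellite HD

print(f"OTA @ 19.39 Mb/s: {ota_hours:.1f} h")  # roughly 28.7 h
print(f"DBS @ 12 Mb/s:    {dbs_hours:.1f} h")  # roughly 46.3 h
```

The gap between those two numbers is exactly the "you may get much more than the advertised hours" effect mentioned earlier in the thread: the same drive holds more hours of a lower-rate stream, regardless of its nominal resolution.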


----------



## ayrton911 (Sep 4, 2000)

That's very interesting. Thank you btwyx.

I would have thought that 720p might require a lower bit rate than 1080i, but I guess that is not the case. 

Also interesting how the bitrates are always changing on DirecTV. 

Thanks for giving me these things to think about as I watch.  Adds to my understanding.


----------



## TyroneShoes (Sep 6, 2004)

ayrton911 said:


> ...I would have thought that 720p might require a lower bit rate than 1080i, but I guess that is not the case...


Well, actually it is the case before compression. If you look at each pixel as a fixed number of raw bits, which actually is the case for uncompressed HD, there are more pixels in a 1080i frame than in a 720p frame. Quite a lot more, actually, but 720p conventionally transmits twice as many frames per second, so that evens it out somewhat. Even taking that into account, uncompressed 1080i still has a higher data rate than uncompressed 720p.

Therefore it either takes a higher data rate to transmit 1080i than it does 720p to get the same PQ or same level of (lack of) artifacting, or the same data rate if you allow a little more compression of 1080i than 720p.

So, if all other things were held equal, 1080i would take more disk space. It doesn't really work out that way, however, for a number of reasons.

Furthermore, while most things actually are equal between 1080i and 720p, such as the number of quantization levels and the color space, interlaced video is a little harder to MPEG-compress efficiently than progressive video, so typically even a little more bit rate is needed to provide comparative PQ.

The end result for most HD by the time it reaches the end consumer is that 720p ends up being compressed at typically about 72:1, while 1080i ends up being compressed at about 83:1, for the same data rate. And this is one reason why motion artifacting is a little more of a problem with 1080i, even beyond the interlace factor.
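The ~72:1 and ~83:1 figures can be sanity-checked with a little arithmetic. A rough Python sketch, assuming 4:2:2 10-bit sampling (about 20 bits per raw pixel) and an illustrative delivered rate of 15 Mb/s; the exact ratios depend entirely on those assumptions:

```python
BITS_PER_PIXEL = 20   # assumed: 4:2:2 chroma subsampling, 10-bit samples
DELIVERED_BPS = 15e6  # assumed consumer bit rate, bits/sec

def compression_ratio(width, height, frames_per_sec):
    """Raw uncompressed rate divided by the delivered rate."""
    raw_bps = width * height * frames_per_sec * BITS_PER_PIXEL
    return raw_bps / DELIVERED_BPS

r720 = compression_ratio(1280, 720, 60)    # 720p60
r1080 = compression_ratio(1920, 1080, 30)  # 1080i30 (60 fields = 30 frames)

print(f"720p60:  ~{r720:.0f}:1")   # comes out near 74:1
print(f"1080i30: ~{r1080:.0f}:1")  # comes out near 83:1
```

Close to the figures quoted above: at the same delivered rate, 1080i simply has more raw pixels to squeeze, so it must be compressed harder.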

So, btwyx is exactly right that the bit rate is the determinant of how much space is used up, and in no way is the resolution the determining factor. But, equivalent bit rates will give lower PQ for 1080i over 720p.

It really does not work out that 1080i streams are larger or denser than 720p streams, however, because the factors that limit what the vendor (OTA, cable, DBS) can send you take resolution, and the differences between what can be done with 1080i vs. 720p, pretty much out of the picture.

And while we have control over whether the end product is displayed in one resolution or another, we have no control over the sent or recorded bit rate.



ayrton911 said:


> ...Also interesting how the bitrates are always changing on DirecTV...


That's the nature of temporal compression, and of statistical multiplexing. Without both of those, it would take a great deal more bandwidth to provide the same service and quality level.


----------



## stevereis (Feb 24, 2006)

norneslo said:


> I'd seen some other posts on this before also, and like Jim said above, generally 720 should take less. What I'd read was that most if not all 720 currently is at 30fps, so you are getting 1280X720 image 30 times a second (921600 pixels), vs. for 1080i, 1920X540 (960000pixels) @60fps (it's interlaced). You have to say, given both are at the same bit rate, you should be getting more data with 1080i @60 (half)frames per second. If someone broadcast 720 at 60fps then they'd be almost equal again.


This is not correct...
- 720p is 720p60 _(not 720p30)_
- 1080i is 1080i30

The last number refers to the frame rate, so
- 720p60 = 1280x720 frames, 60 times a second (60 * 921,600 pixels/sec)
- 1080i30 = 1920x540 fields, 60 times a second (60 * 1,036,800 pixels/sec)

Thus, the pixel rate for 720p is almost the same as for 1080i, and that's why the standard broadcast bit rate of 19.39 Mb/s yields essentially the same PQ.
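Spelling out the arithmetic in that list as a quick Python check:

```python
# Pixels delivered per second for each broadcast format:
# 720p60 sends full 1280x720 frames, 60 per second;
# 1080i30 sends half-height 1920x540 fields, 60 per second.

p720_rate = 1280 * 720 * 60    # 60 full frames/sec
i1080_rate = 1920 * 540 * 60   # 60 fields/sec, each 540 lines

print(p720_rate)               # 55,296,000 pixels/sec
print(i1080_rate)              # 62,208,000 pixels/sec
print(i1080_rate / p720_rate)  # 1080i carries 12.5% more pixels/sec
```

So the two formats are within about 12% of each other in raw pixel rate, which is why the same channel bit rate serves both about equally well.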


----------



## Cheezmo (Apr 26, 2004)

And since DirecTV only broadcasts 1080i at 1280x1080i, it only requires about 2/3 the bit rate to deliver 2/3 the quality that full 1080i should have, so of course they give it less than half most of the time. 

2/3 the resolution, 1/2 the bit rate, ...

I saw Verizon's FiOS today and even the SD looked good. Imagine that, if you can. I'd forgotten that SD can actually be watchable.


----------



## Andrew_S (Nov 12, 2001)

newsposter said:


> Very curious question indeed...wonder when they will start 1080p and how much that will take up?


As far as I know, 1080p is not a broadcast standard. So the answer to this question is... 0.


----------



## NMLobo (Aug 12, 2006)

norneslo said:


> I'd seen some other posts on this before also, and like Jim said above, generally 720 should take less. What I'd read was that most if not all 720 currently is at 30fps, so you are getting 1280X720 image 30 times a second (921600 pixels), vs. for 1080i, 1920X540 (960000pixels) @60fps (it's interlaced).
> 
> Ron


With 1080i you are getting 2,073,600 pixels 30 times a second. Some less expensive sets take only 540 lines and double them to create 1080i, while the better sets interleave and present all 1080 lines within 1/30 of a second.

"1080i conveys the images in an interlaced format (the i in 1080i). Sources get "painted" on the screen sequentially: the odd-numbered lines of resolution appear on your screen first, followed by the even-numbered lines--*all within 1/30 of a second.* "

The interlace scan process makes two separate scans of an image appear as one frame. In the case of 1080i video, this means that there are two fields of 540 lines each that are perceived as a single video image. Scanning 60 fields per second instead of 30 entire frames per second has numerous advantages, most notably less flicker, improved detail, and half the bandwidth used.


----------



## mr.unnatural (Feb 2, 2006)

You can't select what resolution the HDTivo records in. You get whatever the source is sending you. All you can do is select the resolution for the signal output to your monitor. It doesn't matter whether the source is DTV or OTA. The digital stream is recorded to the hard drive exactly the way it was transmitted. 

The output is determined by the resolution setting you have selected. If the original signal was transmitted in 720p and you have 1080i selected as your output, then the data gets upconverted to 1080i. If the source was 1080i and you select 720p, then it gets converted to that output. If the output is set the same as the source, it's simply passed through directly with no conversion.

1080p is an HDTV standard, but there are no stations currently broadcasting in 1080p.


----------



## Andrew_S (Nov 12, 2001)

mr.unnatural said:


> 1080p is an HDTV standard, but there are no stations currently broadcasting in 1080p.


You're right. 1080p/24 and 1080p/30 are defined in the ATSC standard but 1080p/60 is not.

link


----------



## TyroneShoes (Sep 6, 2004)

Andrew_S said:


> You're right. 1080p/24 and 1080p/30 are defined in the ATSC standard but 1080p/60 is not...


And there are a couple of good reasons why even if it were, you would likely not see it ever used for broadcasting, cable, or sat.

The biggest reason is, again, the data rate. 1080p/60 would require an uncompressed data rate of about 3 Gb/s. If you squeeze that into a SMPTE 310-formatted 19.39 Mb/s channel, the compression artifacts would be significant, greatly outweighing any possible benefit. Of course cable or sat could use this format and keep the same data rate by cutting the number of HD channels they offer in half, but that certainly will not happen. Since there is not likely to be any 1080p/60 content for them to turn around in the first place, even if they came up with a new compression scheme that avoided the artifacts of MPEG-4, MPEG-2, or J2000, it still would be unlikely that this format would see the light of day for subscriber delivery.
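The ~3 Gb/s figure is easy to check, assuming something like 24 bits per raw pixel (e.g. 8-bit 4:4:4); the exact value depends on the sampling you assume:

```python
# Uncompressed data rate for 1080p/60 at an assumed 24 bits/pixel,
# and the compression ratio needed to fit it into one ATSC channel.

raw_bps = 1920 * 1080 * 60 * 24   # pixels/sec * bits/pixel

print(raw_bps / 1e9)              # just under 3 Gb/s
print(raw_bps / 19.39e6)          # ~154:1 to fit a 19.39 Mb/s channel
```

A ratio around 154:1 is roughly double what broadcasters already apply, which is the "artifacts would be significant" point in concrete terms.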

The other big reason is that there is absolutely no advantage to 1080p/30 over 1080i (for 1080p displays, which will soon be the norm), and a likely insignificant advantage of 1080p/60 over 1080i, especially for the data rate penalty needed to enjoy that slight potential advantage.

1080i can deliver 1080p/30 quality to 1080p displays, assuming the original content was acquired as 1080p (and most is). A 1080p display can buffer each field of 1080i and simply stitch the two of them back together and display them progressively, which is 1080p/30 by definition. It is a completely non-destructive process, meaning that the 1080i formatting does not compromise the original 1080p PQ.


----------



## TyroneShoes (Sep 6, 2004)

stevereis said:


> ...
> - 720p60 = 1280X720 frame 60 times a second (60*921,600 pixels/sec)
> - 1080i30 = 1920x540 fields 60 times a second (60*1,036,800 pixels/sec)
> 
> Thus, the pixel rate for 720p is almost the same as for 1080i and that's why the standard broadcast bitrate of 19.2 Mb/s yields essentially the same PQ.


"Essentially" being the operative word. But 10% fewer pixels per second means 10% less artifacting on motion, again with all else held equal. And as the stations drop the bit rate from 18 Mb/s (standard for O&O CBS stations) to as low as 9-12 Mb/s or lower, the artifacting for 1080 over 720 becomes more and more noticeable.

Add in what is NOT equal, such as interlace factor, interline flicker, and the fact that interlaced video does not compress as efficiently as progressive, and PQ equivalency becomes significantly less "essentially the same".

Bottom line, 720p will have better perceived PQ on motion, while 1080i will have better perceived resolution on still images, assuming those images have better resolution to begin with. And the difference becomes more dramatic as the bit rate drops.

720p seems to fill the bill a little better, because moving images are the nature of television. Otherwise it would just be radio with pictures, and 1080i would then be just fine.

Even "essentially the same" has noticeable differences.


----------



## ayrton911 (Sep 4, 2000)

mr.unnatural said:


> You can't select what resolution the HDTivo records in. You get whatever the source is sending you. All you can do is select the resolution for the signal output to your monitor. It doesn't matter whether the source is DTV or OTA. The digital stream is recorded to the hard drive exactly the way it was transmitted.
> 
> The output is determined by the resolution setting you have selected. If the original signal was transmitted in 720p and you have 1080i selected as your output, then the data gets upconverted to 1080i. If the source was 1080i and you select 720p, then it gets converted to that output. If the output is set the same as the source, it's simply passed through directly with no conversion.
> 
> 1080p is an HDTV standard, but there are no stations currently broadcasting in 1080p.


Yeah, I know you don't select the recording resolution. The point of the original post was that I record ABC, FOX, and ESPN exclusively, which are all 720p.

However, as I've learned, that really has no impact on how much space it takes up, since the bit rate determines it.


----------



## newsposter (Aug 18, 2002)

So if you don't watch DVDs, it seems pointless to get a 1080p TV, right?


----------



## ayrton911 (Sep 4, 2000)

newsposter said:


> So if you don't watch DVDs, it seems pointless to get a 1080p TV, right?


Well, from what I've heard people say, 1080p televisions look better with 720p and 1080i content than 720p televisions do. I don't exactly understand why that is, but I've heard many who are supposedly knowledgeable in this area state it.


----------



## TyroneShoes (Sep 6, 2004)

All I know is what I've seen, and from the same seating distance my sister's 60" Sony 1080p SXRD seems to provide better PQ for 480i content than my nearly-identical 60" Sony 768p LCD does (both calibrated by me). I was really not expecting that. It does 720 and 1080 a bit better, too, only partly because there is a better absolute black level on SXRD.

I sit a bit farther than 7.5 feet, which is the distance for a 768p set where the screen-door effect comes into play, so one would think that this would not have any bearing on things. But whatever the reason, there is a noticeable difference between PQ of HD and SD content on my set, and a much less-noticeable difference on hers. Bottom line, you apparently don't need 1080 material to see a benefit on at least some 1080p sets.


----------



## newsposter (Aug 18, 2002)

TyroneShoes said:


> All I know is what I've seen, and from the same seating distance my sister's 60" Sony 1080p SXRD seems to provide better PQ for 480i content than my nearly-identical 60" Sony 768p LCD does (both calibrated by me).


Do you mean when it's set to 480i, or not? If you mean set to something higher, then it sounds like the same mystery as when I tell people my CRT RPTV makes SD look so much better when I kick it up. Must be the TV.


----------



## mr.unnatural (Feb 2, 2006)

The SXRD will always look better than an LCD RPTV (at least better than any currently available). It just does blacks much better than LCD ever could. I would hardly call them "nearly identical" unless you're talking about the styling of the outer case. LCoS will also never suffer from the screen-door effect that plagues LCD rear projection sets.


----------

