# HD TiVo that supports 1080p



## tivoguy28 (Apr 9, 2005)

I'm looking for an HD TiVo that supports the 1080p format, whether it be a standalone TiVo or through DTV


----------



## phox_mulder (Feb 23, 2006)

Why?

Nothing is broadcast in 1080p, and won't be for many, many years.

The only place you might find a 1080p signal is an HD DVD player, an Xbox 360, or a PS3 (when it materializes).


phox


----------



## Ein (Jul 7, 2004)

phox_mulder said:


> Why?
> 
> Nothing is broadcast in 1080p, and won't be for many, many years.
> 
> ...


Of the above, only the PS3 is capable of 1080p.


----------



## Dssturbo1 (Feb 23, 2005)

phox is right, no need for it.
DirecTV satellite, cable co, or OTA local broadcast are not going to be 1080p for a long time.

the PS3 will read a Blu-ray 1080p disc, but is it supposed to be capable of output in 1080p? doesn't matter for any HD TiVo unit anyway, just wondered


----------



## bkdtv (Jan 9, 2003)

> the PS3 will read a bluray 1080P disc but is it supposed to be capable of ouput in 1080P?


The $599 PS3 version with HDMI will. The $499 version will not.

HD-DVD and Blu-ray are both 1080p formats.


----------



## kturcotte (Dec 9, 2002)

At the moment, HD-DVD is only capable of 1080i.
And how many 1TB hard drives are you going to add to your HD TiVo to record these 1080p movies? lol


----------



## TyroneShoes (Sep 6, 2004)

Dssturbo1 said:


> phox is right no need for it.
> directv satellite, cable co or ota local broadcast are not going to be 1080p for a long time...


"long time" implies it will still happen. I don't think so.

1080p is a terrific display format, and just happened to work out to be the _de facto_ display format for any non-CRT display that can do 1920x1080 resolution, because they are all progressive by default. IOW, 1080p as a display format didn't emerge "over" 1080i because it's a "better" format than 1080i (it really isn't), it only emerged because no modern display can scan interlaced--they all scan progressively. It has nothing to do with 1080p being better than 1080i, it has to do with 1080p as a display format being easy and 1080i being impossible.

But 1080p makes no sense as a content or distribution format, because it takes twice the bandwidth, all else being equal, of 1080i. And since any 1080p set worth its salt can rescale 1080i back to true 1080p, there is no need for it, since it really doesn't improve anything beyond 1080i. 1080i is simply 1080p, rescaled non-destructively, and it can also be simply stitched back together as true 1080p with no artifact or resolution penalty, if acquired properly in the first place. So there's no point in 1080p as a distribution format, and likely never will be.
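That 2x figure is just scanline arithmetic. A quick sketch (my simplification: uncompressed, luma only, ignoring chroma, blanking, and compression, so the absolute numbers aren't real broadcast payloads, only the ratio matters):

```python
# Raw scanline arithmetic behind the "twice the bandwidth" claim.
# Assumptions (mine, not from the thread): 1920x1080, 8 bits per
# sample, luma only, no chroma, no blanking, no compression.
WIDTH, HEIGHT, BITS = 1920, 1080, 8

def raw_mbps(passes_per_sec, lines_per_pass):
    """Uncompressed data rate in Mbit/s for a given scan cadence."""
    return WIDTH * lines_per_pass * BITS * passes_per_sec / 1e6

rate_1080i30 = raw_mbps(60, HEIGHT // 2)  # 60 fields/s of 540 lines each
rate_1080p60 = raw_mbps(60, HEIGHT)       # 60 full frames/s of 1080 lines

print(rate_1080i30)                 # 497.664
print(rate_1080p60)                 # 995.328
print(rate_1080p60 / rate_1080i30)  # 2.0
```

Same pixel clock per pass, twice the lines per second: exactly double, before any codec gets involved.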


----------



## cheer (Nov 13, 2005)

Especially as the vast majority of source material is still 24 frames/sec film, right? So if it's telecined as 1080i (~60 fields/sec), a decomb filter is just going to pull it back down to 24 frames/sec progressive, right? (Correct me if I'm misinterpreting this...this is one of those subjects that I feel like I _almost_ understand.) So I wouldn't expect 1080p (60 frames/sec) to be any different, because wouldn't you still pull it back down to 24 frames/sec? (All of the above assumes, as TyroneShoes mentions, a progressive display device.) I would think that's what you want -- anything beyond the original 24 frames/sec is just padding anyway.

Of course, this doesn't apply to things shot on HD video, but that's usually interlaced anyway.


----------



## bkdtv (Jan 9, 2003)

> Especially as the vast majority of source material is still 24 frames/sec film, right? So if it's telecined as 1080i (~60 fields/sec), a decomb filter is just going to pull it back down to 24 frames/sec progressive, right?


Correct.



> "long time" implies it will still happen. I don't think so.


Well, never is a very long time.

There won't be much benefit to 1080p60 for film, but with appropriate repeat flags, it shouldn't be that different from 1080i60 today. The real benefit -- and challenge -- will be in delivering 1080p video.

When AVC, VC-1, or some other modern codec becomes the de-facto standard for video delivery at all levels in the chain (acquisition, studio, cable, etc), then I think you will see 1080p replace both 1080i and 720p. It's clearly not going to happen with MPEG-2. Nor is it likely to happen in the next ten years. Real-time encoders for these formats are still in their infancy, and it's going to be a long time before the computational power exists to deliver 1080p60 video in ~20Mbps, assuming such is even possible with AVC or VC-1.

These codecs (and their decoding requirements) were designed with today's technology in mind. But in ten years, processors could be 100x to 1000x faster, allowing for higher quality video at superior efficiency compared to what is available today.


----------



## TyroneShoes (Sep 6, 2004)

I guess I need to clarify some of what I posted earlier.

1080p/30 is a rescaled version of 1080i/30. 1080i/30 can be reconstituted back to 1080p/30 without loss. 1080p/60, well, that's a slightly different animal. 1080p/60 is the one that would double the bandwidth requirements, actually, but it would also not really improve motion artifacts all that much over 1080p/30, or at all over 720p/60. Still, it would be a slight technical improvement if ever adopted.
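The "without loss" part is easy to see if you model a frame as a list of scanlines and the interlace/weave round trip as pure line shuffling. This is a toy model of mine, and it only works this cleanly when both fields were scanned from the same instant, which is the "if acquired properly" caveat:

```python
# Toy model of the lossless 1080i <-> 1080p/30 round trip ("weave").
# A frame is just a list of 1080 scanlines; line content stands in
# for pixel data.

def split_fields(frame):
    """Interlace: even-numbered lines form the top field, odd the bottom."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Reconstitute the progressive frame by re-interleaving the fields."""
    frame = []
    for t, b in zip(top, bottom):
        frame.append(t)
        frame.append(b)
    return frame

frame = [f"line {n}" for n in range(1080)]
top, bottom = split_fields(frame)
assert len(top) == len(bottom) == 540
assert weave(top, bottom) == frame  # nothing lost in the round trip
```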

Much of the advantage of a 1080p set is that sets modern enough to be 1080p also typically do the rescale in a manner in which the benefits of original 1080p/30 material delivered as 1080i get reconstituted to real 1080p/30. Sets older than that typically retain the 1080i artifacts, even when rescaling to progressive. So again, it is not 1080p per se that makes a 1080p set attractive, as much as how it does 1080p.

Regardless of that, there is a huge bandwidth penalty for moving to 1080p/60, with little or no PQ improvement. That is not a very tempting offer for anyone--broadcast, cable, or DBS--who might have to deliver it.

Compression improvements are certainly viewed as opportunities, but in the reality of today's landscape always as opportunities to deliver more streams, never to deliver slightly improved existing streams. Doubling your channel count equates to profit, while tiny PQ improvements do not. If you have a choice, and they will, 1080p/60 just won't be the answer.

If compression technique ever improved to extreme lengths where bandwidth was plentiful and cheap, to the equivalent level of nickel-a-gallon gas for 100 MPG cars, maybe slight improvements with huge bandwidth penalties might start to make sense. But probably none of us will ever live to see it.


----------



## TyroneShoes (Sep 6, 2004)

cheer said:


> Especially as the vast majority of source material is still 24 frames/sec film, right? So if it's telecined as 1080i (~60 fields/sec), a decomb filter is just going to pull it back down to 24 frames/sec progressive, right? (Correct me if I'm misinterpreting this...this is one of those subjects that I feel like I _almost_ understand.) So I wouldn't expect 1080p (60 frames/sec) to be any different, because wouldn't you still pull it back down to 24 frames/sec? (All of the above assumes, as TyroneShoes mentions, a progressive display device.) I would think that's what you want -- anything beyond the original 24 frames/sec is just padding anyway.
> 
> Of course, this doesn't apply to things shot on HD video, but that's usually interlaced anyway.


You are sort of right.

But, I think we might be confusing two related but independent concepts: interlace/progressive, and pulldown. You can discuss the benefits and techniques of pulldown completely apart from the vagaries of various interlaced/progressive techniques, and probably make easier sense out of each. While they can impact each other, they can be considered separately, as they are technically unrelated processes commonly found together in the larger equation.

That said, I think pulldown could be removed as a point from this discussion and make it a lot simpler for all of us, for now.

Also, HD is typically shot as progressive. Interlaced has pretty much gone away. 1080/24/60 is the standard for HD video for high-level production, and 720p/60 and 1080p/30 video cameras are becoming the norm for net and local TV video.

1080/24/60 is also common for telecine. 1080/24/60 is sometimes mistakenly considered an interlaced system, but it really is not. The confusion comes because the alternate lines are scanned to two different storage busses, even lines to one and odd to another. While that sort of _sounds_ like an interlaced approach, that only refers to how the scanned lines are stored. They are still scanned as progressive, and the ultimate output at playback is progressive.

The beauty of 1080/24/60 as a standard is that you can convert it to any other HD standard on the planet easily and without inducing artifacts other than those limiting the resolution of the target standard itself.

Where pulldown becomes important is when acquisition is done as 1080/24/60. 24fps (actually 48fps are recorded) is used here specifically because of how well it translates to ATSC broadcasting and 1080p displays, and it also dovetails nicely with film telecine (also 24 fps). If done properly, it is pulled up to 60 fields allowing it to be transmitted as 1080i/60, and then pulled back down in the display as true 1080p/24.

It also compresses nicely, because the pulled-up fields are to some extent duplicates of previous fields, and a significant fraction of them can be ignored or discarded by the encoder. At the decoder end, the pulldown either recreates them from existing fields that were originally duplicates, for those sets that can't reconstitute true 1080p, or just uses the fields that actually were encoded to recreate true 1080p/24 for those sets that can accommodate that.
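The cadence described above can be sketched in a few lines. This is my simplification: I'm materializing the duplicate fields as copies, whereas a real MPEG-2 encoder would signal them with repeat_first_field flags instead of storing them:

```python
# Sketch of 3:2 pulldown (24 fps film -> 60 fields/s) and its inversion.
# Fields are represented as references to their source frame.

def pulldown_32(frames):
    """Expand 24 fps frames to 60 fields/s using the 3:2 cadence:
    frames alternately contribute 3 fields and 2 fields."""
    fields = []
    for i, frame in enumerate(frames):
        copies = 3 if i % 2 == 0 else 2
        fields.extend([frame] * copies)
    return fields

def inverse_telecine(fields):
    """Recover the original 24p frames by skipping the duplicate fields."""
    frames, i = [], 0
    while i < len(fields):
        frames.append(fields[i])
        # step past this frame's 3 or 2 fields, matching the cadence
        i += 3 if len(frames) % 2 == 1 else 2
    return frames

film = [f"frame {n}" for n in range(24)]  # one second of 24 fps film
fields = pulldown_32(film)
assert len(fields) == 60                  # 60 fields per second, as 1080i/60
assert inverse_telecine(fields) == film   # original 24p recovered exactly
```

Half the transmitted fields carry no new picture, which is why this cadence is so friendly to the encoder.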


----------



## actionj (Sep 2, 2004)

Whoever said that HD-DVD is 1080i is incorrect. All HD-DVDs are 1080p. It is the Toshiba HD-DVD player that is only able to output the movies at 1080i. The next batch of HD-DVD players should be able to output the full 1080p. If you currently have a 1080p TV, then it should be able to grab the 1080i signal from the Toshiba HD-DVD player and add in the extra lines to make it 1080p anyways. 

As for PS3/Xbox, I am unaware of any current Xbox 360 games that are made in 1080p, and have heard no plans for games to be in 1080p. I think the majority will be 720p/1080i.


----------



## kturcotte (Dec 9, 2002)

Yes, the HD-DVD discs are mastered in 1080p, but the current players can only output 1080i. I should have specified current players.


----------



## ryttingm (May 17, 2005)

It is also important to note that while the first-generation Toshiba players only output video at 1080i, the Samsung Blu-ray player uses the exact same Broadcom chip as the Toshiba player to decode the data from the disc. In order to produce the 1080p output on the Samsung, a second de-interlacing chip was added to take the 1080i signal from the Broadcom chip and convert it back to 1080p. There is no real benefit that the Samsung player provides with its 1080p output. In both cases the signal gets interlaced and de-interlaced. In the case of the Toshiba, the interlace happens in the player and the de-interlace happens in the display. For the Samsung, they both happen in the player. Here's a very informative thread on avsforum about this and other Blu-ray vs HD-DVD FUD:


http://www.avsforum.com/avs-vb/showthread.php?t=718337



----------



## kturcotte (Dec 9, 2002)

Well that's kind of stupid. What's the point of that? That justifies the extra $500? Do they make a standalone processor so I can de-interlace and re-interlace the signal coming from my DirecTivo 10 million times? Look just great it will!


----------



## Dssturbo1 (Feb 23, 2005)

they did it for the same reason the OP is asking for it.......marketing hype+sales


----------

