longhornsk57's profile

Mentor

 • 

33 Messages

Friday, July 13th, 2012 1:39 AM

what's up with 1080i setting but no 1080p?

So I ordered the HD upgrade. I figured I'd see what settings there are on the DVR, so I went to the screen settings, and the highest one is 1080i...

What's up with that? Am I really getting interlaced frames? Can this DVR not output 1080p?

What am I missing here?

I have to believe you guys are getting 1080p...

Mentor

 • 

33 Messages

12 years ago

Yeah that was established near the first post.

What came up now was that Oz was saying my 1080p TV could somehow take a 1080i signal and turn it into a 1080p signal, and I was saying that isn't possible unless I lose half of my frames. And unless Uverse is giving me 60 fps of 1080i (not sure why they'd be doing that), what he's saying isn't possible, unless I missed something.

ACE - Expert

 • 

27.6K Messages

12 years ago

"...It seems to me the reason Uverse doesn't stream 1080p @ 30FPS is it would take up too much bandwidth..."

 

That's the statement I was responding to, but then again, I'm an idiot when it comes to all this technical mumbo-jumbo.

Mentor

 • 

33 Messages

12 years ago

Yeah, since I'm super late to the HD game, I figured we were up to the point where service providers could do 1080p now, but it looks like it's still too much bandwidth. So yeah, you're right, only Blu-ray and game consoles and stuff can do it. 1080i is still TONS better than that SD crap I was watching before, though. Plus my phone actually outputs 1080p on everything I record with the video camera, and on movies/streaming, so that's pretty cool to watch on a big 1080p TV.

Master

 • 

4.2K Messages

12 years ago

No, it's 1080i at 60 fields per second (odd and even lines, hence interlaced). 1080p is actually 24 fps, which is what film is, or 30 fps for video. There is no such thing as 1080i at 30 fps as far as I know, and it would result in a bad picture IMO. 1080i = 30 fields of even lines and 30 fields of odd lines, whereas 1080p has them combined in a single frame.

 

http://www.axis.com/products/video/camera/progressive_scan.htm

Mentor

 • 

33 Messages

12 years ago

Right, but what I am saying is that you are not going to get a 1080p-quality picture from 1080i. You cannot get a progressive image from two interlaced images no matter what TV you have, because the raw data isn't there.

Master

 • 

4.2K Messages

12 years ago


@longhornsk57 wrote:

Right, but what I am saying is that you are not going to get a 1080p-quality picture from 1080i. You cannot get a progressive image from two interlaced images no matter what TV you have, because the raw data isn't there.


You are right in the sense that you are not going to get Blu-ray quality, but that is due to more variables than just interlaced vs. progressive. Your flat panel will still display at its native resolution.

Mentor

 • 

33 Messages

12 years ago

Yeah, I hear that. I optimized my TV yesterday with the visual EQ settings (LCD, 1080p) and was watching some of the HD stuff, and it looked really amazing, so yeah, I definitely like the quality.

ACE - Master

 • 

6.9K Messages

12 years ago


@longhornsk57 wrote:

Yeah, since I'm super late to the HD game, I figured we were up to the point where service providers could do 1080p now, but it looks like it's still too much bandwidth. So yeah, you're right, only Blu-ray and game consoles and stuff can do it. 1080i is still TONS better than that SD crap I was watching before, though. Plus my phone actually outputs 1080p on everything I record with the video camera, and on movies/streaming, so that's pretty cool to watch on a big 1080p TV.



It has nothing to do with bandwidth; even OTA broadcasts are 1080i. It's a network issue, not a provider issue. There is a reason only one provider (DirecTV) has any 1080p content, and those are strictly on-demand movies.

Expert

 • 

9.4K Messages

12 years ago

Longhorn,

All modern TVs have a video processing chip inside them usually called the scaler/deinterlacer. This chip is responsible for changing any incoming signal format into what the display can actually display natively.

The scaler/deinterlacer is the chip that is responsible for changing the 1080i/60 signal from the broadcaster into a progressive signal that the (LCD Panel, Plasma Panel, etc.) can display.

It converts interlaced video into progressive video by using motion-adaptive interpolation.

A 1080i signal does not have frames. It is a sequence of fields that alternate between the top field (scan lines 1, 3, 5, etc.) and the bottom field (scan lines 2, 4, 6, etc.). There are 60 fields per second in a 1080i signal. Each field is independent in BOTH space and time. That means that the pixels in each field come from a different part of the image captured by the camera (spatially independent) AND each field was actually snapped by the camera at a slightly different time (temporally independent). That is why a 1080i signal cannot be said to have "frames" because you do not get one continuous picture if you combine two adjacent fields together.
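The alternating top/bottom field structure described above can be sketched in a few lines of NumPy. This is only an illustration (the array stands in for pixel data; nothing here is broadcast code): each field carries 540 of the frame's 1080 scan lines.

```python
import numpy as np

# A stand-in for one captured image: 1080 scan lines of 1920 pixels.
HEIGHT, WIDTH = 1080, 1920
frame = np.arange(HEIGHT * WIDTH, dtype=np.uint32).reshape(HEIGHT, WIDTH)

# Top field: scan lines 1, 3, 5, ... (0-indexed rows 0, 2, 4, ...).
top_field = frame[0::2]
# Bottom field: scan lines 2, 4, 6, ... (0-indexed rows 1, 3, 5, ...).
bottom_field = frame[1::2]

print(top_field.shape)     # each field is 540 x 1920
print(bottom_field.shape)  # i.e., half the vertical resolution
```

In a real 1080i capture the two fields would also be snapped at different instants, which is the temporal independence the post describes; a static array can only show the spatial half of the story.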

Your TV takes the stream of fields and buffers them, and carefully examines each field to identify objects that are moving and objects that aren't. It then applies two different algorithms to the fields to create full frames using the information in the current field, as well as information in the previous and subsequent fields. The result is a 1080p signal at 60 frames per second, each frame consisting of half actual pixel information from the 1080i signal, and half algorithm-created information.
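A toy version of that weave-vs-interpolate decision can be written in a few lines. This is a deliberately simplified sketch of motion-adaptive deinterlacing, not what any actual scaler chip does: real hardware uses far more sophisticated motion detection and edge-directed interpolation. The function name and threshold are my own invention for illustration.

```python
import numpy as np

def deinterlace_motion_adaptive(prev_field, cur_field, next_field,
                                cur_is_top, threshold=8):
    """Toy motion-adaptive deinterlacer (illustrative only).

    cur_field carries half the output frame's scan lines; prev_field and
    next_field are the surrounding fields of the opposite parity. Static
    regions are "woven" from the neighboring fields; moving regions fall
    back to reusing the current field's own lines ("bob"/line doubling).
    """
    h, w = cur_field.shape
    frame = np.zeros((2 * h, w), dtype=np.float64)

    own = slice(0, None, 2) if cur_is_top else slice(1, None, 2)
    other = slice(1, None, 2) if cur_is_top else slice(0, None, 2)

    # Lines the current field actually carries pass through unchanged.
    frame[own] = cur_field

    # Crude motion detector: how much did the missing lines change between
    # the previous and next fields (which share the missing parity)?
    prev_f = prev_field.astype(np.float64)
    next_f = next_field.astype(np.float64)
    static = np.abs(next_f - prev_f) < threshold

    weave = (prev_f + next_f) / 2       # temporal average where static
    bob = cur_field.astype(np.float64)  # reuse nearest carried lines where moving
    frame[other] = np.where(static, weave, bob)
    return frame
```

For a perfectly static scene, weaving reconstructs the original progressive frame exactly, which matches the post's point that half of each output frame is real pixel data and half is created by the algorithm.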

The modern scaler/deinterlacer chips in TVs do this job fairly well, although there are wide variations in quality from different manufacturers.

A 1080p signal that you get from Blu-Ray is not 60 frames per second. It is typically 24 frames per second, which matches the film rate. 24 frames per second looks very "juddery" when played back because there is not nearly as much temporal information as a 1080i signal has. (A 1080i/60 signal has 60 updates per second of temporal information, corresponding to the 60 fields. This is more than 2.5x what a 1080p/24 signal has).

Because a 1080p signal has far less temporal information, the bandwidth requirements for 1080p are actually less than what is required for 1080i. So the issue with providers not doing 1080p is not a bandwidth issue. The reason they typically don't do it is because not every HDTV out there can process a 1080p/24 signal. Early HDTVs (typically rear-projection CRT models) could display 1080i signals only, because they actually still use electron beam scanning, which has always been interlaced in consumer equipment.
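The two numeric claims above (2.5x the temporal updates, and lower raw data rate for 1080p/24) check out with back-of-the-envelope arithmetic. These are raw, uncompressed pixel rates; actual broadcast bitrates depend on the MPEG encoder, so treat this as a sanity check rather than a bandwidth measurement.

```python
fields_per_sec_1080i = 60
frames_per_sec_1080p = 24

# Temporal updates per second: 60 vs 24, i.e. 2.5x as many for 1080i/60.
temporal_ratio = fields_per_sec_1080i / frames_per_sec_1080p
print(temporal_ratio)  # 2.5

# Raw pixels per second: each 1080i field carries 540 of the 1080 lines.
pixels_1080i = 1920 * 540 * fields_per_sec_1080i      # 62,208,000 px/s
pixels_1080p24 = 1920 * 1080 * frames_per_sec_1080p   # 49,766,400 px/s

# 1080p/24 moves fewer raw pixels per second than 1080i/60.
print(pixels_1080i > pixels_1080p24)  # True
```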

Mentor

 • 

33 Messages

12 years ago

Wow.

 

Thanks for that explanation, it totally makes sense, and I can see I had some confusion about what exactly interlaced vs. progressive was 🙂
