When shopping for HDTVs today, there are so many acronyms and numbers that it can make anyone’s head spin. Today we’re going to compare and contrast 1080i and 1080p.
Firstly, both 1080i and 1080p represent the same resolution, 1920×1080. This is the highest resolution currently available for HDTVs, but just because a TV supports 1080i, it doesn’t mean that 1920×1080 is the actual resolution of the TV. HDTV sets apply a lot of background processing to display 1080p (usually from Blu-ray), 1080i (live TV, such as sports), 720p (film-based TV shows, such as sitcoms) and 480i/p (standard definition).
The “i” in 1080i stands for “interlaced,” and what it means is that only half the lines are drawn in each pass. Almost all TV content (except Blu-ray film content, which may be encoded at 24 Hz) is displayed at 60 Hz, or 60 fields per second. With 1080i content, only the even horizontal lines are drawn in one field, followed by only the odd horizontal lines in the next field. This means that in any given field, only 540 of the picture’s 1080 horizontal lines are being displayed. If an interlaced image is showing fast motion, a type of breakup can sometimes be seen, as in the picture below.
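The even/odd split described above can be sketched in a few lines of Python. This is purely illustrative (the frame is a toy list of rows, not real video data), but it shows how one progressive frame divides into two interlaced fields:

```python
# Illustrative sketch: splitting a progressive frame into the two
# interlaced fields described above. A "frame" here is just a list of
# rows, with line 0 at the top of the picture.
def split_into_fields(frame):
    even_field = frame[0::2]  # lines 0, 2, 4, ... drawn in one field
    odd_field = frame[1::2]   # lines 1, 3, 5, ... drawn in the next field
    return even_field, odd_field

# A toy 4-line "frame" stands in for the 1080 lines of a real picture.
frame = ["line0", "line1", "line2", "line3"]
even, odd = split_into_fields(frame)
print(even)  # ['line0', 'line2'] -- half the lines per field
print(odd)   # ['line1', 'line3']
```

Each field carries only half the vertical detail, which is exactly why fast motion can show the breakup mentioned above: the two fields capture slightly different moments in time.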
The “p” in 1080p stands for “progressive,” and what it means is that all 1080 lines of resolution are shown in every frame. It is the best possible quality available today, and is what makes Blu-ray unique.
Ideally, 1080p offers a better picture because every line and pixel is represented in each frame of video, as opposed to 1080i, where alternate lines are drawn in each field. In practice, on the best TVs, there will usually be no difference, because the TV performs a process called “deinterlacing.” Deinterlacing combines the even and odd fields into an image with a full 1080 lines per frame, and when done properly, it essentially turns a 1080i picture into a 1080p one.
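The simplest form of deinterlacing, often called a “weave,” just interleaves the two fields back into one full frame. Real TVs use far more sophisticated, motion-adaptive algorithms, but this hedged sketch shows the basic idea of recombining fields:

```python
# Minimal "weave" deinterlacer sketch: interleave an even field and an
# odd field back into one full-resolution progressive frame. Real sets
# do much more (motion detection, interpolation); this is only the core
# recombination step.
def weave(even_field, odd_field):
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)  # line 0, 2, 4, ...
        frame.append(odd_line)   # line 1, 3, 5, ...
    return frame

print(weave(["line0", "line2"], ["line1", "line3"]))
# ['line0', 'line1', 'line2', 'line3'] -- a full progressive frame
```

A plain weave only works perfectly when nothing moves between the two fields; with motion, the fields disagree, which is where the breakup artifacts mentioned earlier come from and why good deinterlacers are motion-adaptive.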
If a manufacturer refers to a TV as being 1080i, it usually means that the TV’s native resolution is less than 1920×1080, because TVs that have that resolution are referred to as being 1080p TVs. These 1080i TVs will usually have a native resolution of either 1366×768, 1280×720 or 1024×768. The lower resolutions don’t necessarily mean that the TV is of inferior quality, as resolution is not the number one factor in image quality. The factors that make up image quality on a TV, in order, are:
1. Contrast Ratio
2. Black Level
3. Color Accuracy & Saturation
TVs Can Accept Resolutions Other Than Their Native One
Every high-definition TV has a native resolution, which is usually the spec that manufacturers use to differentiate them. However, just because a TV can accept a 1080p signal (or a 1080i one), it does not mean that the TV actually has 1920×1080 pixels.
TVs use processes called upscaling and downscaling to take an image that is lower or higher in resolution than the screen’s native resolution and make it fit. While this is routine for 1080p TVs (they have to do it with standard-definition content, as well as 720p content), 720p TVs can also take a 1080i/p image and downscale it to fit.
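The crudest way to do this scaling is nearest-neighbor: map each display line back to the closest source line. Actual TVs use much better filters, so treat this as an illustrative sketch of the mapping only:

```python
# Illustrative nearest-neighbor scaling of one dimension: map each of
# the display's lines to the closest source line. Real TVs use far
# better resampling filters; this only shows the basic mapping.
def scale_lines(lines, target_count):
    src_count = len(lines)
    return [lines[i * src_count // target_count] for i in range(target_count)]

# Downscaling 4 "source lines" to 2, and upscaling 2 to 4:
print(scale_lines(["a", "b", "c", "d"], 2))  # ['a', 'c']
print(scale_lines(["a", "b"], 4))            # ['a', 'a', 'b', 'b']
```

Note that upscaling cannot invent detail (lines simply get repeated or interpolated), which is why feeding a 720p set a 1080i signal does not give you a 1080-line picture.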
1080i and 720p usually refer to the same type of TV these days, but remember the factors that make up image quality, because there are examples of 720p TVs that look significantly better than 1080p ones. Also, take note of your viewing distance from the TV. If you sit too far away, the benefit of 1080p will be lost.
A 1080p TV will be better for you if:
1. It is equal to or better than a comparable 720p model in contrast, black level and color accuracy.
2. You sit close enough to take advantage of the increased resolution.
3. You are planning on watching 1080p content, such as movies on Blu-Ray disc.
4. There isn’t too much of a difference in price.
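For point 2, a common back-of-the-envelope rule is that 20/20 vision resolves roughly one arcminute (1/60 of a degree), so 1080p detail disappears once each pixel subtends less than that. The numbers and the 50-inch example below are my own illustration, not from the article:

```python
import math

# Rough estimate of the farthest viewing distance (in inches) at which
# 1080p's extra detail is still visible, assuming 20/20 vision resolves
# about one arcminute per pixel. This is a rule-of-thumb sketch, not a
# precise vision model.
def max_distance_for_1080p(diagonal_inches):
    # Height of a 16:9 screen, derived from its diagonal.
    height = diagonal_inches * 9 / math.hypot(16, 9)
    pixel_height = height / 1080  # one line of a 1080p picture
    one_arcminute = math.radians(1 / 60)
    return pixel_height / math.tan(one_arcminute)

# For a hypothetical 50-inch set:
print(round(max_distance_for_1080p(50) / 12, 1), "feet")  # roughly 6.5 feet
```

In other words, on a mid-size screen you need to sit within a couple of meters before the jump from 720p/1080i to 1080p is visible at all, which is why viewing distance made the checklist above.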