Wednesday 16 August 2006

HD: 1080i vs 1080p

Can you tell the difference between 1080i and 1080p? Does that even mean anything to you? The debate is raging over at HDBeat and Home Theatre mag.

Basically, it boils down to the way the picture is scanned (drawn) on your TV. The i or p at the end tells you whether it is scanned interlaced or progressively.

Essentially, with an interlaced format the picture is scanned to the screen in alternating lines, so in any single pass the TV only gets half of the lines of the picture; the remaining lines are filled in on the next pass. Because this all happens so quickly, we perceive the picture as complete.

Progressive scan differs from this in that every line is drawn on each pass, giving a clearer, flicker-free picture.
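If you like to think in code, here's a rough sketch of the idea (in Python, purely for illustration, nothing to do with how a real TV works): an interlaced source sends the odd-numbered lines in one field and the even-numbered lines in the next, and a simple "weave" puts them back together into a full frame.

```python
# Toy illustration of interlaced transmission: a "frame" is just a list of scanlines.

def split_into_fields(frame):
    """Split a full frame into two fields: one with the even-indexed lines,
    one with the odd-indexed lines. (Real broadcast field order varies.)"""
    top_field = frame[0::2]      # lines 0, 2, 4, ... (one pass)
    bottom_field = frame[1::2]   # lines 1, 3, 5, ... (the next pass)
    return top_field, bottom_field

def weave(top_field, bottom_field):
    """Re-interleave two fields into a full progressive frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

if __name__ == "__main__":
    # A tiny 6-line "picture" standing in for 1080 lines.
    frame = [f"line {n}" for n in range(6)]

    top, bottom = split_into_fields(frame)
    print("field 1:", top)                 # half the lines per pass...
    print("field 2:", bottom)
    print("rebuilt:", weave(top, bottom))  # ...woven back into the full frame
```

A progressive source, by contrast, just sends the whole list every time.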

In theory, then, progressive scan is superior to interlaced. But (and this is where the arguments start) most HDTVs, LCDs and plasmas for example, can't physically display an interlaced picture (only traditional CRT TVs can do this), so when presented with an interlaced signal they de-interlace it and convert it into a progressive picture. So the argument goes that there is no difference between 1080i and 1080p, because your HDTV will convert the 1080i signal into a 1080p signal anyway!

But it's the complications with de-interlacing that cause the problems: some TVs don't even do it correctly, and you can actually end up with less resolution than you started with. There are many other factors that affect the perceived quality of the picture too, from the TV's refresh rate to the way the picture is output by your source device (e.g. an HD-DVD or Blu-ray player). You only have to read the comments on the linked articles above to realise how many factors there are.
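To see how a lazy de-interlacer can throw away detail, here's a hypothetical "bob"-style approach in the same toy Python setup as above: it discards one field and simply doubles up the lines of the other, so the rebuilt frame has the right number of lines but only half the vertical detail.

```python
def bob_deinterlace(field):
    """Naive 'bob'-style de-interlace: rebuild a full-height frame from a
    single field by repeating each line. Right line count, half the detail."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)  # duplicated, not the real missing line
    return frame

if __name__ == "__main__":
    frame = [f"line {n}" for n in range(6)]
    top_field = frame[0::2]            # only one field is actually used
    print(bob_deinterlace(top_field))  # full height, but lines 1, 3, 5 are gone
```

A good de-interlacer does something much smarter than this, but the point stands: how well the TV handles that step matters as much as the numbers on the box.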

High definition is a quagmire of specifications, misinformation, interpretations and subjective opinions. So, how does the average consumer cut through all the hyperbole and crap? Well, not easily is the honest answer.

I do think it's fair to say that the average consumer probably wouldn't be able to tell the difference between 1080i and 1080p; for most people, a decent HDTV that processes any standard-def or HD signal properly will be more than satisfactory. But the real enthusiasts will need to sit and wait until all of these theories can be put to the test; HD is still very much in its infancy.

At the end of the day, the best thing the average consumer can do is read lots of reviews and go and see the TVs for themselves. Before buying, ask the shop to help you set them up, then fiddle around with the sources you will actually be using (e.g. Sky HD, HD-DVD or Blu-ray) and see which TV you think looks best. You're the one who has to live with it, so you decide what's best for you.
