HD1080i, Compression Quality, Contrast
I am giving out serious kudos now to FOXHD for its production team and broadcast team excellence. Hands down, the best HD in my area comes from Discovery HD, all the NBCHD stuff, and PBSHD, with ESPNHD, MHD, and MOJOHD always delivering fabulously.
Thumbs up, FOX, and thanks for the very well-crafted and inspirational attention to the Declaration of Independence before the opening.
All in all, the best image you get comes not from resolution, but from well-managed digital video compression technology. Your 1080i HD picture is all digital, so your problems will have nothing to do with interlaced or progressive scanning, but with MPEG-2. It's a lossy method for getting your video to you, and at any rapid scene change you see obvious block recovery and macroblocks in high-action stuff. What you want is H.264 or MPEG-4 (DISH) or VC-1 (Blu-ray and HD DVD). Unfortunately, Comcast uses hardware MPEG-2 compression codecs, at broadcast and at the Set Top Box, so it's all about bandwidth allocation when it's coming down the cable and into your living room. The next level in HD programming delivery is to improve the use of MPEG vs. bandwidth. Right now Comcast is OK, but allocating very high bandwidth to FOX-HD channels is up to each C.O.; in my area it's excellent.
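To see why bandwidth allocation matters so much for MPEG-2, here's a back-of-the-envelope sketch of the per-macroblock bit budget for a 1080i stream. The 15 Mbps figure is an assumption for illustration (actual cable allocations vary by headend), but the macroblock geometry comes straight from MPEG-2's 16x16 blocks:

```python
# Back-of-the-envelope: average bit budget per MPEG-2 macroblock for 1080i HD.
# The ~15 Mbps allocation is an assumed figure; cable headends vary.
import math

width, height = 1920, 1080
mb_size = 16  # MPEG-2 macroblocks are 16x16 pixels

# Encoders pad 1080 lines up to 1088 so the height divides into macroblock rows.
mb_cols = math.ceil(width / mb_size)   # 120
mb_rows = math.ceil(height / mb_size)  # 68 (1088 / 16)
macroblocks = mb_cols * mb_rows        # 8160 macroblocks per frame

bitrate = 15_000_000    # bits/sec -- assumed allocation
fps = 30000 / 1001      # 29.97 frames/sec

bits_per_frame = bitrate / fps
bits_per_mb = bits_per_frame / macroblocks

print(f"{macroblocks} macroblocks/frame, ~{bits_per_mb:.0f} bits each on average")
# On a rapid scene change nearly every macroblock needs fresh data at once,
# so that tiny average budget collapses and you see blocking until the
# encoder catches up -- exactly the "block recovery" described above.
```

That average of roughly 60 bits per macroblock is why a starved MPEG-2 channel falls apart on cuts and fast action, and why the newer codecs (or simply more bandwidth per channel) look so much better.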
Game on - will attend to the rant on contrast and ratios after the game.
CONTRAST and RATIO
You will understand much better why the Philips and Sony ambient light detectors are more important than you may have realized, and how they add value to your experience in ways you should not take for granted. Study this:
Contrast is probably the #1 "quality of experience" factor, and therefore also a top parameter for selecting a good widescreen HD display; I see this a lot elsewhere on other HD sites. The truth is a bit dicey, though, in that the source material can, and many times does, have an older-style NTSC color range profile, and what you see for contrast may actually be the result of production editing in that video rather than display capability. It is entirely possible to look at a display in the store and base your choice on poorly edited contrast in whatever source content happens to be on-screen at the time.
Manufacturers and product people in HD displays know this, and often include filtering and color tuning in the display processing firmware to maximize the full range of their image at any given moment. Sometimes this is a user-controlled option in the SETUP menu of the display; sometimes it's bundled into a more general "Color Enhance" option in a menu selection. In my opinion this is a double-edged blade, in that you benefit greatly most of the time, but some of the stuff coming in will be mishandled by the process. That is why the contrast test demo above is in grayscale: Color Saturation-Gamma-Intensity measurements are based on processing that happens in a GRID analysis (large areas of the screen are sampled for ratios), making the in-scene and scene-to-scene details part of the processing. Basically, this means the contrast numbers in the specs are not sufficient to describe what to really expect from the display.
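To make the GRID analysis idea concrete, here is a minimal sketch of that kind of processing: carve the screen into large cells, average the luminance in each, and take the ratio of brightest to darkest cell as an in-scene contrast figure. The 4x4 grid and the checkerboard test frame are my own illustrative assumptions, not any manufacturer's actual firmware:

```python
# Sketch of grid-based in-scene contrast analysis: large screen regions are
# sampled, averaged, and compared as ratios. Grid size is an assumption.

def grid_contrast(frame, rows=4, cols=4):
    """frame: 2D list of luminance values (0..255).
    Returns the brightest-cell / darkest-cell ratio."""
    h, w = len(frame), len(frame[0])
    cell_h, cell_w = h // rows, w // cols
    means = []
    for r in range(rows):
        for c in range(cols):
            cell = [frame[y][x]
                    for y in range(r * cell_h, (r + 1) * cell_h)
                    for x in range(c * cell_w, (c + 1) * cell_w)]
            means.append(sum(cell) / len(cell))
    darkest = max(min(means), 1e-6)  # guard against a pure-black cell
    return max(means) / darkest

# Checkerboard test pattern: alternating video-black (16) and video-white (235)
# cells, mimicking a grayscale contrast test demo.
frame = [[235 if ((x // 4) + (y // 4)) % 2 == 0 else 16 for x in range(16)]
         for y in range(16)]
print(f"in-scene contrast ~ {grid_contrast(frame):.1f}:1")
```

Notice the result depends entirely on what is in the scene — feed it a washed-out, poorly edited source and the measured ratio drops no matter how capable the panel is, which is exactly why the spec-sheet contrast number alone can't tell you what you'll see.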
I would like to see black level identified more in a standard spec, since that is often one of the few display characterizations that is not dependent on the source. This means: view the display with no content at a medium backlight setting (some displays detect this and darken to their max value). In-store lighting is too bright for you to assess black levels, so this is where you should ask more questions of an experienced salesperson (if available). Plasma displays in particular used to be rather poor at black level, but in almost all new 1080p plasma screens you see now that has been corrected, and most 1080p plasmas are excellent. Some LCD screens will also bleed the white backlight and not provide a true enough black level for you... my advice is to read the customer reviews; many times the truth about black level quality is presented there.
I have found that the large box brands these days pretty much ALL do this well in newer designs; Philips and Sony sets (and others) measure room light and set contrast to suit, and most of the time that is going to produce the preferred look in intensity and contrast.
It's really important to note, however, that the worst out there is the ABC stuff,
... that page is almost completely wrong; it's kind of pathetic.
As an aside - PLASMA and LCD TVs do NOT SCAN, so any mention of digital scan lines is pure nonsense. Just let me say, you righteous 720p people are your own worst enemy, since 720p is decompressed, upscaled to 1080, and recompressed for cable HD1080i broadcast, making it the blockiest, fuzziest-looking image I get in HD. Maybe Over The Air HD is better, but everyone I know is using DISH (kudos for the MPEG4, BTW, it's very good) and cable.
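The pixel math behind that 720p-to-1080i cable complaint is simple enough to sketch: the frame is inflated 2.25x in pixel count with no new detail, then run through a second lossy MPEG-2 pass:

```python
# Rough pixel math for the 720p -> 1080 cable transcode described above:
# the upscale adds pixels but zero detail, and the re-encode adds a
# second lossy compression generation on top of the broadcaster's first.
src_w, src_h = 1280, 720    # broadcaster's native 720p frame
dst_w, dst_h = 1920, 1080   # cable's 1080i transmission format

scale = (dst_w * dst_h) / (src_w * src_h)
print(f"{src_w * src_h} -> {dst_w * dst_h} pixels ({scale:.2f}x upscale), "
      "no new detail, one extra lossy MPEG-2 generation")
```

So the encoder is asked to spend bits on 2.25x the pixels while describing the same underlying picture — which is why the result comes out blockier and fuzzier than a native 1080i source.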
If you want Over The Air 720p then go to: http://antennaweb.org/aw/welcome.aspx
and aim your antenna and hope for the best. Second to the worst is TNTHD, which does this BIG HEAD upscaling. I won't watch either channel, except that WCVB weather is probably the best in New England; otherwise I wouldn't see any ABCHD at all.
360 days left to Digital Cutover