optimum video

Discussion in 'DirecTV TiVo Powered PVRs & Receivers' started by Old Moe, Aug 11, 2007.

  1. Old Moe · New Member
     Messages: 6 · Likes Received: 0 · Joined: Jul 27, 2007
    Reading various posts has left me somewhat doubtful about my method for getting the highest quality picture. I have one 1080p TV and one 720p TV,
    with an HR10 hooked to each of them. I set the HR10 on the 1080p TV to output 1080i and the one on the 720p TV to output 720p. I seem to get a high
    quality picture on both, but now I am beginning to wonder if I am missing out on an even better HD experience, especially after reading some of these posts. Please advise.
     
  2. litzdog911 · TechKnow Guide
     Messages: 12,027 · Likes Received: 1 · Joined: Oct 18, 2002 · Location: Mill Creek,...
    It's amazing how much confusion exists around the various HiDef video resolutions. Bottom line is that both 720p and 1080i are true HiDef formats. Some networks use 720p and some use 1080i. Supposedly 720p is a bit better for fast motion, such as sports broadcasts. But side-by-side you'll have a hard time seeing any difference.
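    If it helps to see the raw numbers, here's a quick back-of-the-envelope comparison (a little Python, just arithmetic on the published frame sizes; nothing specific to the HR10 or any particular channel):

        # Rough pixel budget of the two broadcast HD formats.
        formats = {
            # name: (width, height, frames per second, interlaced?)
            "720p60":  (1280, 720, 60, False),
            "1080i30": (1920, 1080, 30, True),   # 60 fields/s = 30 full frames/s
        }

        for name, (w, h, fps, interlaced) in formats.items():
            px_per_frame = w * h
            px_per_second = px_per_frame * fps
            note = "  (sent as two 540-line fields per frame)" if interlaced else ""
            print(f"{name}: {px_per_frame:,} px/frame, {px_per_second:,} px/s{note}")

    Per second the two formats deliver roughly the same number of pixels (about 55 million vs. 62 million), which is why neither clearly beats the other: 1080i packs more detail into each frame, while 720p redraws the whole frame 60 times a second.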

    Here's something you can try ....
    HDNet broadcasts a 10-min HD Video Test Pattern (I think it's on early Tuesday morning, but check their online schedule). After you record the test pattern, try viewing it in both 720p and 1080i. I'll bet you won't see any difference.
     
  3. JimSpence · Just hangin'
     Messages: 30,905 · Likes Received: 36 · Joined: Sep 19, 2001 · Location: Binghamton, NY
    Somewhat still on topic.

    Would you get a better picture feeding a 1080p TV with a 720p or a 1080i signal?

    Let the conjecture begin. :)
     
  4. TyroneShoes · HD evangelist
     Messages: 3,604 · Likes Received: 0 · Joined: Sep 6, 2004
    I don't think there should be any question about it. The only factors involved are resolution and deinterlacing. We've already established that the HR10 deinterlaces 1080i to a 720p output just fine, and since virtually every 1080p set is modern enough to also deinterlace 1080i to 1080p just fine, that leaves resolution.
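    To make "deinterlaces just fine" a little more concrete, here is a toy weave/bob sketch in Python/NumPy. It is purely illustrative; it is not the HR10's or any TV's actual algorithm, the names are made up, and it assumes plain 8-bit luma fields:

        import numpy as np

        def deinterlace(prev_frame, top_field, bottom_field, threshold=10):
            """Toy motion-adaptive deinterlacer (illustrative only).

            Static areas: 'weave' the two 540-line fields into one 1080-line frame.
            Moving areas: 'bob' -- rebuild the missing lines from a single field,
            which is where the ~540-line effective resolution with motion comes from.
            """
            h, w = top_field.shape[0] * 2, top_field.shape[1]
            out = np.empty((h, w), dtype=top_field.dtype)

            # Weave: top field fills the even lines, bottom field the odd lines.
            out[0::2] = top_field
            out[1::2] = bottom_field

            # Crude motion detection: compare the woven frame to the previous frame.
            motion = np.abs(out.astype(np.int32) - prev_frame.astype(np.int32)) > threshold

            # Bob: where motion was detected, replace each interior odd line with
            # the average of the even lines above and below it.
            interp = (out[0:-2:2].astype(np.int32) + out[2::2].astype(np.int32)) // 2
            interp = interp.astype(out.dtype)
            odd_lines = out[1:-1:2]
            odd_lines[motion[1:-1:2]] = interp[motion[1:-1:2]]
            return out

    A real set or DVR does this with far better motion detection and edge-aware interpolation, but the trade-off is the same: full vertical detail where the picture is still, roughly half where it moves.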

    Obviously, in theory there will be a rez hit when sending 1080i content out at 720p. But that only applies if the actual source resolution exceeds 720p, which it rarely does. Whenever there is more than minimal motion, 1080i effectively drops to about 540 lines of vertical resolution, so that narrows the gap somewhat.
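    To put rough numbers on that "narrows the gap" point, here is a crude model (toy numbers only; it ignores scaler quality, overscan, and everything else) that simply assumes an interlaced source gives up about half its vertical detail once there's motion:

        def surviving_lines(source_lines, output_lines, in_motion):
            # Toy assumption: with motion, interlaced sources yield ~half their lines.
            effective_source = source_lines // 2 if in_motion else source_lines
            return min(effective_source, output_lines)

        for output in (720, 1080):
            for motion in (False, True):
                label = "moving" if motion else "static"
                print(f"1080i source -> {output}-line output, {label}: "
                      f"{surviving_lines(1080, output, motion)} lines")

    On still material the 1080 output path keeps a real advantage (1080 vs. 720 lines); once there's motion, both paths land around 540 lines, which is why the hit from 720p output is smaller in practice than it looks on paper.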

    But one of the few places where resolution does reach its limits is the test pattern litzdog mentioned. With 720p output it tops out at about 6.0 MHz on my set, while with 1080i output it tops out at about 6.8 MHz, so there is definitely a noticeable difference, even on a set whose native resolution is only 768 lines.

    But since my 3-year-old Sony doesn't deinterlace as well as the HR10, and since film- and tape-sourced video rarely realizes the full sharpness that 1080 is capable of, 720p output seems to work best on my set.

    On a 1080p set that would likely not be the case. I would be very surprised if 720p were the better choice for a 1080p set. 720p content would not benefit, of course, but surprisingly, 480 content probably would; ironically, it always seems to look best on a 1080p set.
     
