Discussion in 'DirecTV TiVo Powered PVRs & Receivers' started by feldon23, Jan 1, 2004.
How do you qualify for the easy feed?
Doug, how you, me, and Darin are keeping this topic relatively innuendo-free is beyond me.
We have class.
Which is highly over-rated.
It's not over yet.
I agree with your statement, but the truth of the matter is that the majority of people have no clue what resolution programs are broadcast in. Frankly, there is no reason a consumer should have to ascertain what resolution a particular broadcaster decides to broadcast in. For the record, I am in the clueless group as far as what the networks are broadcasting in, but I do consider myself to be somewhat knowledgeable in the area of Home Theatre and consumer electronics.
Honestly, the whole HD roll-out is just a mess. If the standard supports multiple resolutions (like 18 of them) then the displays they are shown on should automatically switch to that resolution. An override would be available for people who wanted to customize. But what the heck do I know...
Where is the signal processed for things like brightness, contrast, 3:2 pull down etc. when a DVI connection is used? In the source device or the display?
Obviously, if you are displaying on a fixed pixel display, then conversion happens in the display or its outboard processor.
I am asking because I have a follow-up question, but I want to find a definitive answer before I ask it.
The ONLY display technology in use today that doesn't have an inherent "native" resolution is CRT. CRT can simply change the number of lines it does in one scan to accommodate any number of resolutions. But unfortunately, most CRT HD sets available don't take advantage of this ability. The electronics within them are designed to scan at one or maybe two specific frequencies (usually 1080, and maybe additionally 480 for SD), and separate electronics in the set run the image through interpolation when the incoming signal does not match that resolution. Why they do this, I don't know, because the picture would look best if displayed at its original resolution rather than interpolated to another. Practically all CRT based computer monitors, even cheap sub-$100 ones, do multi-scan, so why this ability can mostly only be found in high-end projectors is a mystery to me.
Anyway, all the other display types have a fixed pixel geometry: no matter what you do with the incoming signal, they only have a specific number of physical pixels to work with, so they MUST interpolate the incoming signal to match their native resolution, if it's not the same already.
So this is why most (all???) STBs don't simply output whatever resolution the original signal is in, because unless you have one of those relatively rare CRT sets that can do multiscan, the TV is going to convert it to a specific resolution anyway. So all things being equal, it's better to do that conversion in the STB while the signal is still digital, and it ensures the guide and other STB generated graphics look better (they don't end up going through an interpolation).
That's why I said the ability to switch output resolutions would benefit high-end CRT owners, because this would let them override the automatic interpolation to a specific resolution, and get the best PQ possible out of the incoming signal. And those who wanted to take advantage of that ability would most likely know which networks are which. For everyone else, leaving it set to the native resolution of your TV is generally best. FWIW, to the best of my knowledge, ABC, ESPN, and in the fall, FOX, use 720p, while all the others use 1080i.
Brightness and contrast are generally done in the display device. 3:2 pull down would not be done in the display device unless you were sending it a 480i signal. So if you have your STB set to 720p, then if any 3:2 pulldown was done to SD programming, it would have to be done at the STB.
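For anyone unfamiliar with the cadence being described, here's a small illustrative sketch (my own, not anything out of an STB's firmware) of how 3:2 pulldown maps 24 fps film frames onto 60 fields/sec interlaced video: each film frame is alternately held for 3 fields, then 2.

```python
# Illustrative sketch of the 3:2 pulldown cadence: 24 film frames/sec
# become 60 interlaced fields/sec by alternately repeating each film
# frame for 3 fields, then 2 fields.

def pulldown_32(film_frames):
    """Expand a list of film frames into interlaced fields using 3:2 cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2  # alternate 3 fields, 2 fields
        fields.extend([frame] * repeats)
    return fields

# 4 film frames -> 10 fields, so 24 frames/sec -> 60 fields/sec
print(pulldown_32(["A", "B", "C", "D"]))
# -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

A deinterlacer doing "inverse telecine" just detects this repeating 3-2 pattern and reassembles the original 24 full frames from the fields.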
My CRT set supports 480p & 1080i. So a STB outputting native resolution would work fine for most of my channels. ABC on the other hand outputs in 720p, which my CRT doesn't support. So if the STB outputs 720p I'm going to get a blank screen. This would be a bad thing. Assuming I knew that ABC was 720p and that my CRT won't support 720p I'd still have to manually switch the STB to output either 480p or 1080i to get a picture. If the STB just side-converts everything to one resolution (that my CRT supports) all I have to know is how to change the channel.
Plasmas will usually do 480p and 720p. So I'd be able to watch either Fox or ABC & ESPN-HD without having to know anything. PBS, NBC, CBS, UPN, HBO, etc. would give me a blank screen. And we're back to knowing what channel outputs what signal.
It's much easier for the masses to output one (supported) resolution at all times. Those with more knowledge and a compatible set can adjust the output resolution to their heart's content.
I think that the best all-around feature would be to be able to select whatever output you want, OR to enable a "native" mode option that would pass whatever signal the show is in. It looks like for whatever reason that neither the HD-Tivo nor the Dish 921 have this option though.
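The trade-off the last few posts describe can be sketched in a few lines. This is purely hypothetical pseudologic (the function and mode names are mine, not any real STB's), using the network formats mentioned earlier in the thread:

```python
# Hypothetical sketch contrasting the two STB behaviors discussed above:
# fixed-output mode (side-convert everything to one resolution the display
# supports) versus a "native" pass-through mode. Channel formats follow
# the list given earlier in the thread; everything else is illustrative.

BROADCAST_FORMAT = {"ABC": "720p", "ESPN": "720p",
                    "CBS": "1080i", "NBC": "1080i", "PBS": "1080i"}

def stb_output(channel, display_supports, mode="fixed", fixed="1080i"):
    source = BROADCAST_FORMAT.get(channel, "480i")
    if mode == "fixed":
        return fixed  # always safe, as long as the display supports it
    # Native mode: pass the source through, which blanks or garbles the
    # picture if the display can't scan that rate.
    return source if source in display_supports else "no picture"

crt = {"480p", "1080i"}  # e.g. a CRT that won't scan 720p
print(stb_output("ABC", crt, mode="fixed"))   # -> 1080i (always works)
print(stb_output("ABC", crt, mode="native"))  # -> no picture (720p unsupported)
print(stb_output("CBS", crt, mode="native"))  # -> 1080i (passes through)
```

Which is exactly why a selectable native mode only helps owners of displays that can genuinely scan every broadcast format.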
While I think it's great that technology moves forward, and we are seeing all sorts of new display innovations, it's too bad that most of them are fixed pixel designs. DLP, LCoS, LCD etc. This is why I still say that an RPTV, if you have the room, is the best way to go. They still offer the best display, and are a bargain because of the new technologies.
My RPTV will display 480i, 480p (?), 960i, 1080i, and it accepts 720p. The reason I put a question mark after 480p is because I am still researching that mode, and the details are sketchy. It is thought that all resolutions below 720p are converted to 960i on my particular set (Sony 57-WV700). Apparently 960i is very easy for the set to convert to, and gives a very solid picture. 720p is up-converted to 1080i. I don't like this idea because I believe that a 720p display will look better than a resolution of 1080i because of the interlacing. It's a shame that the best looking resolution (in theory) is more expensive to reproduce and therefore ignored by most (all?) manufacturers.
When I said there are 18 formats for HD, I didn't mean that the display device could or should display all of them. Since the networks have "standardized" on a few resolutions, what I would like to see is the native display of the ones that are being broadcast - without the end user having to figure out the resolution of the broadcast and then selecting the output on the STB. You brought up a really good point, and it's one that supports my whole "HD roll out is a mess" theory. There is conversion going on in at least one place in the signal path.
It bothers me that the HD roll out has so many compromises in it. This was the perfect chance to do a standard correctly, but because of various (read political) reasons, it is a compromise. On top of that, it is confusing for consumers.
At least we are moving forward. Albeit rather slowly.....
Are you sure the display won't convert a 720p signal to something else?
I believe that plasma displays output at a fixed resolution, so regardless of what signal you feed them, there is conversion of the signal to match the native resolution of the display. Am I making an incorrect statement?
So what does the display device do with a DVI (digital) signal? Is converting it to analog for display the ONLY thing it does?
I also have a Sony set (46WT510), and it actually DOWNconverts 720p to 480p, and many Sonys do the same. And 960i is technically 1080i... all they do for 960i is put it in a 1080i frame, then vertically stretch the image so the black bars are off-screen. There is also a guy at HomeTheaterSpot who claims that 480 signals are always converted to 960 (and hence, 1080) on Sony CRTs. I'm not sure if I believe that, but if it's true, then these sets only have one scan rate.
There are sets that don't support 720p AT ALL (they won't do an internal conversion), but it's usually moot since virtually all STBs give you an option of what resolution they output. The fact that my set downconverts 720p is also moot, since my set never sees 720p... the signal is converted to 1080i in the STB before it ever sees the TV. But I agree, the no compromise solution would be for my set to be able to natively display 720 lines as well as 1080, and have the STB output whatever resolution the programming came in at. But I think the fact that so few displays actually can do multiple resolutions natively is why no STBs have the option to pass the signal in its native format. In fact, my STB has the output resolution selection as a physical switch in the BACK of the unit... it's meant to be a set and forget setting.
Well, that's going to depend on the set, and I can't pretend to know what they all do, but I think generally, if it's an analog display (CRT), then it's converted to analog, then processed like any other input. If it's a digital display, then it stays in the digital domain, and any processing is done digitally.
Yep, I'm sure. I have a Zenith HD-DVR that outputs 480i, 480p, 720p, and 1080i. Pressing the button on the front of the STB cycles the output resolutions. And while I was incorrect for the sake of convenience in saying that 720p will give me a 'blank screen' what it does in reality is give me a signal that looks like an old VCR that should be outputting on channel 4 instead of channel 3. A vertical line down the middle of the screen with both halves of the picture on the wrong sides and wavy horizontal lines. 480i does give a blank screen as the component inputs can't handle that signal.
My projector does 480p, 720p, and 1080i. But it loses sync for a couple of seconds when the signal type changes. So I leave the DVR at 1080i and don't worry about ABC too much. Alias still looks great.
My main point is that there are a large number of people spending a large amount of money on TVs who have no idea what they're really getting. I personally know a gent who thought that his "HDTV Ready" set would give him HDTV from every channel on basic cable. He had no idea that he'd need a STB and (for OTA) an antenna to receive true HDTV. Or that only certain channels were broadcasting in HDTV. And only at certain times. Or that he couldn't use the yellow video cable for HDTV. At least he'd upgraded from coax. I only found out when he was disappointed in the "HDTV" that he was seeing after hearing me rave about my PQ. A quick trip to his house and a bit of explaining only started to set him straight. Telling him he'd need to spend another $500 or so on a HDTV STB didn't make him too happy.
The same type of thing happens to folks that think they're getting 5.1 DD from the red & white audio jacks. Their receiver supports it so they must have it, no matter what connector they used, right?
Fortunately every HDTV Set-Top Box on the market lets you pick the output resolution and converts between them. Very few HDTVs can handle 720p natively and many Plasmas can't handle 1080i so this is the perfect solution.
The only people left out are those with high-end equipment that can handle 720p and 1080i and display them natively. I don't know of any STBs that let you set it to "Output Source Format".
The Sony HD-200 and Zenith HD-SAT520 do (as I'd expect the Sony HD-300 and LG-branded version to also do): The native setting converts 480i signals to 480p and passes along 720p and 1080i signals directly to the display without conversion. The hybrid 1 and 2 settings are similar, but hybrid 1 converts all HD signals to 1080i, and hybrid 2 converts them to 720p. EZ DVI automatically detects the monitor type and converts all signals accordingly.
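That mode description can be boiled down to a small lookup. This is just my own encoding of the behavior described above, not anything from Zenith's documentation, and the function name is made up:

```python
# Sketch of the HD-SAT520 output-mode behavior as described above:
# native passes HD through and bumps 480i to 480p; hybrid 1 forces all
# HD to 1080i; hybrid 2 forces all HD to 720p. SD handling in the
# hybrid modes is my assumption (same 480i -> 480p as native).

def sat520_output(source, mode="native"):
    is_hd = source in ("720p", "1080i")
    sd_out = "480p" if source == "480i" else source  # deinterlace SD
    if mode == "native":
        return source if is_hd else sd_out  # pass HD through untouched
    if mode == "hybrid1":
        return "1080i" if is_hd else sd_out  # all HD forced to 1080i
    if mode == "hybrid2":
        return "720p" if is_hd else sd_out   # all HD forced to 720p
    raise ValueError(f"unknown mode: {mode}")

print(sat520_output("720p", "native"))   # -> 720p
print(sat520_output("720p", "hybrid1"))  # -> 1080i
print(sat520_output("480i", "hybrid2"))  # -> 480p
```

The hybrid modes are essentially the "one resolution all the time" behavior discussed earlier, while native is the pass-through that high-end display owners want.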
This is a valuable feature when you have a decent display. I have the Fujitsu P50 plasma display and it's absolutely annoying using it with the Echostar 6000 receiver where you have to switch manually between HD and SD output modes, and it only offers 720p or 1080i all-the-time choices for HD modes. I much more enjoy the behavior of the HD-SAT520. It provides easy access to all the advanced scaling features in the plasma that will only function on 480i and 480p signals.
Apropos of nothing, nearly every single plasma (every one?) on the market supports 1080i -- even if they are only a 480-line display.
Several plasmas cannot be fed 720p.
Plasma owners are not generally going to miss the "no native output" thing; neither are CRT owners really.
Neither is pretty much anyone really.
I mean a few people would like it; but as a practical matter very few people will really be missing it.