Discussion in 'TiVo Roamio DVRs' started by ljknight, Nov 26, 2013.
Most people don't understand resolution (like 4K) versus bit rate. You can provide a high stated resolution at a low bit rate and most people will not know the difference. OTA could be 1080i at 19 Mb/s and so could cable, but you can also get 1080i at much lower bit rates, and the picture quality will suffer even if your HDTV tells you your resolution is 1080i.
In Rochester NY the local Fox station broadcasts in 720p, and a 1 hr prime time show is almost 2X as large as a 1 hr prime time show from my local ABC & CW stations, which are also both broadcast in 720p. The reason is that the ABC station (13.1) carries the local CW station as a sub-channel (13.2) and splits the bandwidth equally between the two, while the Fox station (31.1) has no sub-channels. The ABC & CW shows look OK, but I would say they are clearly not as sharp as the shows from stations that are using more bandwidth (have a higher bit rate). Which gets back to my point of not believing the OTA broadcasters will be willing to spend money to broadcast in 4K (or even 1080p), as it is clear my local ABC & CW broadcaster is willing to cut quality down substantially to avoid the increased costs that broadcasting on two frequencies would incur.
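The "almost 2X as large" observation checks out with a little back-of-the-envelope arithmetic. An ATSC 1.0 channel carries roughly 19.39 Mb/s of MPEG-2 payload; the even split between subchannels is the assumption from the post above, and the numbers are illustrative, not measured:

```python
# An ATSC 1.0 multiplex carries ~19.39 Mb/s of MPEG-2 payload.
ATSC_PAYLOAD_MBPS = 19.39

def gb_per_hour(video_mbps):
    """Approximate recording size for one hour at a given bitrate."""
    megabits = video_mbps * 3600      # seconds in an hour
    return megabits / 8 / 1024        # -> gigabytes (binary GB)

# Fox-style station: the whole multiplex feeds one HD stream.
fox = gb_per_hour(ATSC_PAYLOAD_MBPS)

# ABC/CW-style station: multiplex split evenly between two subchannels.
abc = gb_per_hour(ATSC_PAYLOAD_MBPS / 2)

print(f"Full mux: {fox:.1f} GB/hr")
print(f"Half mux: {abc:.1f} GB/hr ({fox/abc:.1f}x smaller)")
```

With an even split, the full-mux recording comes out at roughly 8.5 GB per hour versus about 4.3 GB for the half-mux station, exactly the ~2X size difference described.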
They also take into account the adjacent markets and the possibility of interference, so that's usually the reason for the holes in your channel lineup.
Let's assume that broadcast 1080i is 15 Mb/s with MPEG-2 compression. If we assume H.264 results in a 50% compression advantage over MPEG-2, and H.265 has another 50% compression advantage over H.264, then a 4K stream would still require 30 Mb/s (that's 4K/60fps). The problem is that a 50% compression advantage is not really obtainable with the real-time encoding used in broadcasting. So expect more like a 35-40% compression advantage, which lands the bitrate between 45-50 Mb/s.
Now Netflix can pre-encode with multi-pass encoders and get the maximum compression advantage. Plus they can also use 24/30fps. That means they could get their 4K streams down to 12-15 Mb/s, which is actually feasible for streaming.
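The estimates in the two posts above follow from simple scaling. This sketch just redoes that arithmetic; the 15 Mb/s baseline, per-generation savings, and the 4x pixel / 2x frame-rate factors are all the thread's assumptions, not measured figures:

```python
# Thread's assumptions: 1080i MPEG-2 baseline of 15 Mb/s, 4x the pixels
# for 2160p vs 1080, 2x the frames for 60 fps vs a ~30 fps equivalent.
BASE_1080I_MPEG2 = 15.0   # Mb/s, assumed broadcast baseline
PIXEL_FACTOR_4K = 4.0     # 3840x2160 vs 1920x1080
FPS_FACTOR_60 = 2.0       # 60 fps vs ~30 fps

def estimate_4k_mbps(gen_savings, fps_factor=FPS_FACTOR_60):
    """Scale the MPEG-2 baseline through two codec generations
    (MPEG-2 -> H.264 -> H.265), each keeping (1 - gen_savings) of the
    bits, then back up to 4K resolution and the given frame rate."""
    per_gen = (1.0 - gen_savings) ** 2
    return BASE_1080I_MPEG2 * per_gen * PIXEL_FACTOR_4K * fps_factor

print(f"Ideal 50%/generation, 4K/60:    {estimate_4k_mbps(0.50):.0f} Mb/s")
print(f"Realtime ~37%/generation, 4K/60: {estimate_4k_mbps(0.37):.0f} Mb/s")
print(f"Multi-pass 50%, 4K/24-30:       "
      f"{estimate_4k_mbps(0.50, fps_factor=1.0):.0f} Mb/s")
```

The ideal case lands on the 30 Mb/s figure, a realistic real-time encoder lands in the 45-50 Mb/s range, and the multi-pass/lower-frame-rate case lands near the 15 Mb/s Netflix estimate.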
Everything is relative.
You can deliver 1080p at 5 Mb/s and 720p at 40 Mb/s. The bitrate is not set in stone on any system. There are set units for resolution and frame rate, but bitrate can be anywhere on the [allowed] map. To make matters worse, the quality of the mastering and encoding can vary too, making a certain bitrate OK in one case and yucky in another.
Another thing to consider is that as resolution goes up, the bitrate does not need to increase proportionally. 4K video might be 4 times the resolution but might need only 50% more bandwidth to achieve the same quality. It gives me a headache.
All other factors being equal (i.e. codec, frame rate, GOP length, etc.), bitrate should scale proportionally with resolution. Although there is a floor for MPEG-2, so if you try to go too low, even if you scale the video accordingly, the results will look like crap. H.264 is much better at handling very low bitrates.
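One common way to make the "proportional scaling" comparison concrete is bits per pixel per frame: bitrate divided by width x height x fps. Equal bits-per-pixel at the same codec and content roughly means comparable quality. The bitrates below are illustrative, in the range of the OTA figures quoted later in the thread:

```python
def bits_per_pixel(mbps, width, height, fps):
    """Bits spent per pixel per frame: a resolution-neutral quality proxy."""
    return mbps * 1_000_000 / (width * height * fps)

# 1080i at 13 Mb/s (~30 full-frame equivalents/s) vs 720p at 9.4 Mb/s (60 fps)
print(f"1080i @ 13.0 Mb/s: {bits_per_pixel(13.0, 1920, 1080, 30):.3f} bpp")
print(f"720p  @  9.4 Mb/s: {bits_per_pixel(9.4, 1280, 720, 60):.3f} bpp")
```

By this metric the 1080i stream here actually spends more bits on each pixel than the 720p one, which is why raw resolution alone tells you so little about picture quality.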
That should not be the case for all lossy types of compression. At least, that is not my understanding. I thought some significant gains in modern codec bandwidth are achieved by "re-using" pre-encoded parts of the screen. And in such a case, a zillion more pixels of similar info can be encoded at not much penalty. Of course, this doesn't work for really complex scenes. My stupid 50% example could be way off, of course.
Am I missing something? Maybe I am crazy.
Technically that's true if you were able to do a multi-pass encode, but in broadcasting there is a lot of real-time encoding going on, so they have standardized on a much simpler set of parameters that don't tend to benefit much from the fancier stuff the codecs are capable of. This is especially true for MPEG-2, which has been in use since the mid-'90s, back when the hardware was a lot less capable and HD didn't exist yet.
Problem is, 1080i broadcast MPEG-2 has been much lower than 15 Mb/s, more like 10 Mb/s or even lower. Most broadcasters don't care about quality. Many will stuff multiple sub-channels in, and sometimes one of those sub-channels is also in HD. So all the channels are bit-starved and look like crap. I would expect things to be no different if they ever tried to broadcast 4K; I would fully expect it to be bit-starved as well.
EDIT: I just looked at the bitrates of some of my recent OTA recordings from the 1080i networks in my area (NBC, CBS, and the CW).
The bitrates ranged from 11.4 Mb/s to 13.65 Mb/s. Then if I look at the two 720p networks, Fox and ABC, I see bitrates of 9.4 Mb/s for ABC and 14.4 Mb/s for Fox. The only network in my area that doesn't have any sub-channels is Fox, and they have the highest bitrate.
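You can sanity-check numbers like these yourself without special tools: the average bitrate of a recording is just its file size divided by its duration. The 6.14 GB example size below is hypothetical, chosen to land near one of the figures above:

```python
def avg_mbps(size_bytes, duration_seconds):
    """Average total bitrate of a recording in Mb/s from size and runtime."""
    return size_bytes * 8 / duration_seconds / 1_000_000

# Hypothetical example: a one-hour recording weighing in at ~6.14 GB
print(f"{avg_mbps(6_140_000_000, 3600):.1f} Mb/s")
```

Note this gives the total transport-stream bitrate, which includes audio and overhead, so the video itself gets slightly less than the printed figure.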
This bit rate bull is the fly in the ointment: people can be given information on resolution etc., but the bit rate is an unknown for most people, so the picture on a less costly HDTV may be better than on a high-cost HDTV that is fed a lower bit rate signal. That's the biggest scam now going on in HDTV.
There are a lot of scams. Like them being "LED TV" when they are not (THAT is the biggest scam). Or built-in smart stuff that will be abandonware after a few years. Or the "viewable" vs. "advertised" size stuff which is coming back again. Or the total lack of control with the "smoothing" and frame rate interpolation (which can be extremely irritating when you want to turn it off or down). USB ports that don't supply appropriate power. Documentation that doesn't explain anything. Default settings that so overblow color and brightness it is insane. Lack of ability to turn the TV's speakers into a center channel for external amplifiers. 3D that is so bad it is like an instant eye/headache hyper-flicker machine. Lack of useful specs info ANYWHERE, like viewing angles, HDMI spec, MTBF, input restrictions, even full dimensions.
Not that I am picky or anything....
I'm a bit bummed about the smart stuff in my Samsung TV. They have newer apps with added features but only release them for the newest model TVs. They have some $300 module you can buy that replaces the entire "brain" of older TVs so they can run the new apps, but it only works for the two most expensive models. I bought the 3rd-tier one, which was still $2K, and it cannot be upgraded. I still use it for VUDU and HBOGo, but I've converted to using my Roamio for Netflix now that the app is actually usable (unlike the Premiere).
Exactly. And they burn through models quickly. Every year or two it is "obsolete" and pretty much useless. They have absolutely no incentive to support the "old" models. This is why a TV should be just a monitor.
I agree. Maybe there is a case for lower cost TVs being a "TV" but the high end stuff should just be a monitor and focus on providing an exceptional picture only, nothing else. Anyone buying a $2000+ "TV" is likely going to have a full home theater setup and not need or use anything but the monitor part of a TV.
It's not really a scam. It is an LED LCD TV. Your average person doesn't know about any of this stuff. It's just that many people are ignorant and don't realize that the LED is just the backlighting of the LCD set. I run into this at work all the time when we install an LCD set. All the ones we use now are LED backlit and most people we run into have no clue. Which is typical for A/V-related stuff. Your average consumer doesn't have much of a clue with audio and video devices. I see it on a weekly basis.
I can't count the number of people I've run into who spent a ton of money on a TV and don't have a full home theater setup. They have a big LCD and just use the TV speakers with a few components connected to it. Or even worse, they have some kind of cheap speaker bar audio system connected to this expensive TV. And then on top of that the picture will look like crap, oversaturated, etc.
In past years it was easy to spend a lot on a so-so TV: 5 years ago I spent $2200 on mine and it was not top of the line. But I find it hard to believe people are spending $2000+ now and not getting great TVs. You can get the Panasonic TC-P60ST60 60-inch TV, which CNET rated as having the 4th best picture of all TVs this year, for under $1500, and a Panasonic TC-P65ZT60 65", which is CNET's highest-rated TV ever, for under $3000.
Of course, if people insist on buying LCD TVs they will end up spending more and not getting as good picture quality. But you can still get a great 60-inch LCD (Samsung UN60F8000 60") for around $2000.
I guess there are people who will buy these top-of-the-line TVs and just use them without a home theater setup, but it sure seems like a waste. Frankly, if someone wants to do that they should just buy something like a Panasonic TC-PS60, which you can get for $1150; they will still have a great picture in a no-frills TV. What blows my mind is that the 50" version is now $700 and likely better than the TV I bought 5 years ago for $2200.
I'm not saying that their TVs were not good, only that their settings were not. Your average person has no idea how to adjust the picture properly on their TV, whether they paid $4K or $400. Of course, your average person also seems to like oversaturated colors, which is another thing that makes the picture look really bad on some of the sets I've seen. I don't expect people to do an involved calibration, but even the basics of adjusting color, tint, contrast, and brightness go a very long way toward a better-looking picture.
I gave up adjusting the settings on TVs at work, because whenever I came back to a set, someone would have messed with the settings, oversaturating the colors and making the picture too bright. So I don't even mess with them anymore.
No it isn't. It is an LCD TV. We didn't call LCD with fluorescent backlights "fluorescent TV" or "fluorescent LCD TV" did we? It would be like switching the battery in an internal combustion engine car from lead acid to lithium and calling it a "lithium car". Absurd!
I know. And they fall for the marketing crap.