You must have MPEG-4 locals then?
You're claiming bitrates of roughly 4mbps for MPEG-2, which is beyond extreme compression for MPEG-2 HD.
What market are you in? I want to see the rabbitears.tv data.
Then it's not 3mbps or it's not MPEG-2. The borderline for complete trash on OTA HD is somewhere in the 6-8mbps range. I want to see the rabbitears.tv data. Are they using MPEG-4? There are a few OTA stations using MPEG-4; most TVs, but not all, can view it. Any TiVo other than the TCD648 (OLED Series 3) should be fine with MPEG-4 OTA.
It would be in a stat mux. You can bias a stat mux, but it still changes the bandwidth allocation multiple times a second. You also keep using GB/hr, which is the wrong unit of measure; broadcast is measured in mbps. You can roughly convert in your head: 10mbps is roughly 5GB/hour, but not quite, because there are 3600 seconds in an hour but 8000 megabits in a GB (technically 8192 if you want to be really precise).
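If you want to sanity-check that conversion, here's a quick sketch in Python (assuming decimal units, i.e. 1 GB = 1000 MB = 8000 megabits):

```python
# Quick sanity check of the mbps -> GB/hour conversion.
# Assumes decimal units: 1 GB = 1000 MB = 8000 megabits.

def mbps_to_gb_per_hour(mbps: float) -> float:
    """Convert a broadcast bitrate in mbps to storage in GB/hour."""
    megabits_per_hour = mbps * 3600   # 3600 seconds in an hour
    return megabits_per_hour / 8000   # 8000 megabits in a GB

print(mbps_to_gb_per_hour(10))  # -> 4.5, i.e. "roughly 5GB/hour, but not quite"
```

Same math in reverse gets you from a recording size back to an average bitrate.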
Again, I am not aware of a single station in the US broadcasting 19.3mbps on a single video channel, though that's what a physical channel carries per the spec. The highest currently in the US is 16-17mbps, and most are 6-12mbps. At 6mbps they can barely squeeze in 2HD/4SD, which is currently the maximum any encoder manufacturer or broadcast engineer will claim they can do, although several stations don't care about quality at all and now cram 3HD channels or more onto a single broadcast, which pretty much guarantees awful VQ. You might be able to get away with 3x720p with no SD subchannels, since 720p compresses more easily than 1080i.
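The 2HD/4SD squeeze is simple budget math; the per-channel averages and overhead figure below are illustrative assumptions, not measurements from any real station:

```python
# Rough subchannel budget math for an ATSC 1.0 physical channel.
# 19.39 Mbps is the ATSC 1.0 payload; the per-channel averages and
# overhead figure here are illustrative assumptions only.

ATSC1_PAYLOAD_MBPS = 19.39

def leftover(hd_channels, sd_channels, hd_avg=6.0, sd_avg=1.5, overhead=1.0):
    """Mbps remaining after a given channel load.

    hd_avg/sd_avg are assumed long-term stat mux averages; overhead
    stands in for PSIP tables, audio, and other non-video data.
    """
    used = hd_channels * hd_avg + sd_channels * sd_avg + overhead
    return ATSC1_PAYLOAD_MBPS - used

print(f"{leftover(2, 4):.2f} Mbps to spare")  # 2HD/4SD at a 6mbps HD average barely fits
```

Run it with 3 HD channels plus SD subchannels and the number goes negative, which is exactly where the awful VQ comes from.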
If you can ID all channels that are broadcasting on that physical channel and record them all at the exact same time, you can figure out if they have "dead" or "wasted" bandwidth, but I doubt it with that channel load. PBS is the most likely culprit for "dead" bandwidth due to having a limited budget for dialing in their encoders properly. WEDH-DT had "dead" bandwidth for quite some time, and they are in a relatively major market (30ish DMA) that's adjacent to WGBH, WGBY, and WNET.
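The "record everything at the same time" check boils down to summing the measured subchannel bitrates and comparing against the physical payload; a sketch, with made-up placeholder numbers rather than real measurements:

```python
# Sketch of the "dead bandwidth" check: record every subchannel on one
# physical channel simultaneously, sum the measured stream bitrates,
# and compare against the physical payload. All bitrates below are
# hypothetical placeholders, not real measurements.

ATSC1_PAYLOAD_MBPS = 19.39

measured_mbps = {            # hypothetical simultaneous recordings
    "x.1 (1080i HD)": 10.8,
    "x.2 (720p HD)": 5.2,
    "x.3 (480i SD)": 1.6,
}

used = sum(measured_mbps.values())
dead = ATSC1_PAYLOAD_MBPS - used
# Some gap is normal (PSIP tables, audio, null packets); a consistently
# large gap is the "dead"/"wasted" bandwidth described above.
print(f"unaccounted: {dead:.2f} Mbps")
```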
I think we need to check your definition of "good".
That's 14.2mbps, which is quite respectable. NBC has gotten the most aggressive with channel sharing in a lot of markets (thanks, Comcast). You can get stunning HD these days out of 12mbps or higher, and acceptable HD out of 8mbps or higher. Bitrate alone doesn't guarantee good VQ, though; back in the MPEG-2 cable days I had a Comcast channel at 17mbps that looked mediocre at best.
Total apples and oranges. Supposedly HEVC is twice as efficient as H.264, which is twice as efficient as MPEG-2, so the 8mbps bar we hit today should eventually come down to 2mbps in HEVC. However, even that's apples and oranges, as MPEG-2 encoding is very mature and HEVC is not. Further, the larger a stat mux group, the better results you can get, so mixing HD and SD channels on ATSC 3.0 will result in very efficient stat muxing.
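The halving rule works out like this, treating the 2x-per-generation figure as the commonly quoted approximation rather than a guarantee:

```python
# The "halving per codec generation" rule of thumb applied to today's
# 8mbps MPEG-2 bar. The 2x figure is the commonly quoted approximation;
# real-world gains depend heavily on encoder maturity.

EFFICIENCY_VS_MPEG2 = {"MPEG-2": 1, "H.264": 2, "HEVC": 4}

mpeg2_bar = 8.0  # mbps for "acceptable" MPEG-2 HD
for codec, factor in EFFICIENCY_VS_MPEG2.items():
    print(f"{codec}: {mpeg2_bar / factor:.0f} mbps")
# MPEG-2: 8, H.264: 4, HEVC: 2
```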
Lastly, comparing Hulu (unless it's the live TV service) to broadcast is again apples and oranges, since that content is offline encoded, which is always more efficient than real-time encoding. Even for streaming TV, they can use VBR to a certain extent to save bandwidth while still spending more bits when the content needs them, and it's not competing against two or three other channels in a stat mux; it's just some sort of constrained VBR.

Netflix is notorious for really incredible encoding at remarkably low bitrates, because they throw both massive horsepower and a lot of intelligence at their encoding. They're entirely offline, so they can have several racks of machines working on different resolutions and codecs of the same file at the same time, and they can spend minutes encoding every second of video. They're using some sort of narrow AI to detect the type of content in each frame or sequence of video, then tune the encoding to match, so they use just enough bitrate at any given time to produce stunning results. Broadcast has none of that relative luxury.