Mass Media & Technology

Half Frame Width HD Video Backhaul

KNBC backhauls two satellite feeds in one 1920x1080 frame (November 2019)

This site closed in March 2021 and is now a read-only archive
Rkolsen
I’m curious if any of you have seen this before. On Thursday, KNBC had their primary anchors on site commemorating the one-year anniversary of the Borderline Bar and Grill mass shooting in Thousand Oaks, CA. They fed their primary anchor feed back to the station as a single full-frame 1920x1080 signal; however, a secondary satellite feed had this:

[Image: two side-by-side 960x1080 feeds in a single 1920x1080 frame]

This feed had a cutaway shot along with a reporter standup. As you can see, it’s two 960x1080 video feeds: each source is squeezed into half the frame, then split apart and resized for air. Apparently on air they look like a full 1920x1080i frame without issue, and KNBC has done this before. This comes from a friend of mine, blizzard59, over at TVNewsTalk. He has a satellite receiver that can pick up the various unencrypted backhauls, and he has never seen it done elsewhere.

I assume they are probably set up that way because most satellite trucks are built with two paths out, and at a memorial event parking may be tight or restricted, making it hard to bring in a second truck.

So I am curious: has this appeared in the UK or elsewhere? Is this some compression technique offered by satellite encoders, where two 1920x1080 feeds are encoded together and decoded to two outputs?
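For illustration, here is a rough numerical sketch of the packing described above, treating frames as grayscale arrays. The crude 2:1 decimation and nearest-neighbour stretch are stand-ins for whatever proper filtering the real encoder and station DVE would use; nothing here is KNBC's actual workflow.

```python
import numpy as np

H, W = 1080, 1920  # full HD raster

def pack_side_by_side(src_a, src_b):
    """Squeeze two 1920x1080 frames to 960x1080 each and pack them into one frame."""
    frame = np.empty((H, W), dtype=src_a.dtype)
    frame[:, : W // 2] = src_a[:, ::2]   # crude 2:1 horizontal decimation
    frame[:, W // 2 :] = src_b[:, ::2]
    return frame

def unpack_side_by_side(frame):
    """Split the packed frame and stretch each half back to full width."""
    a = np.repeat(frame[:, : W // 2], 2, axis=1)  # nearest-neighbour stretch
    b = np.repeat(frame[:, W // 2 :], 2, axis=1)
    return a, b

src_a = np.full((H, W), 10, dtype=np.uint8)   # stand-in for the cutaway shot
src_b = np.full((H, W), 200, dtype=np.uint8)  # stand-in for the standup
packed = pack_side_by_side(src_a, src_b)
out_a, out_b = unpack_side_by_side(packed)
print(packed.shape, out_a.shape, out_b.shape)
```

The round trip preserves the frame geometry but only half the horizontal detail of each source, which is the trade-off discussed in the replies below.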
dosxuk
This was how 3D was transmitted, with one half for each eye.

The reduced horizontal resolution would have an effect on picture quality, maybe they're using this as a way of multiplexing multiple previews onto one feed, rather than for taking to air?
Rkolsen
I don’t know. I realize switchers can easily crop and enlarge through a DVE, but it seemed weird. As for previews, I don’t think that would work, as those shots (specifically the off-angle one) would have to be synced up with the main feed. My friend didn’t see any other feeds from KNBC on the satellite (unless they were encrypted), so another mode of transmission would have needed a delay (or the primary camera feed would).
thegeek Founding member
Never seen that before!

We do, however, use HEVC coders to send four 1080p pictures in a 2160p raster in an 18 MHz carrier for remote production.
noggin Founding member
A number of US platforms use lower resolutions than 1920x1080 for emission (i.e. the final leg to the viewer) - with 1440x1080 (as BBC HD used to use), 1280x1080 and 1080x1080 all used I believe - so the resolution loss caused by a 50% horizontal squeeze may not be that huge.

DVCPRO HD - a widespread production codec (it's supported by pretty much every platform - FCP, Avid, Premiere Pro, Quantel/GVG - and operating system) - uses 1440x1080 for 1080i25 and 1280x1080 for 1080i29.97 (the lower frame rate of 50Hz systems means you get a bit more resolution).

I've worked on shows where video screens are fed by quarter-screen crops and zooms of a single quad-split graphic source (rather than tying up four replay server ports, you use one plus some re-sizers). It's fine if the screen is small in frame, or lower resolution (or defocused).
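A minimal sketch of that quad-split arrangement, assuming a 2x2 row-major tile layout in a 1920x1080 raster (the actual layout on any given show is the operator's choice):

```python
import numpy as np

H, W = 1080, 1920  # raster carrying the quad split

def quad_pack(tiles):
    """Tile four pre-shrunk 960x540 frames into one 1920x1080 raster."""
    frame = np.empty((H, W), dtype=tiles[0].dtype)
    for i, tile in enumerate(tiles):
        r, c = divmod(i, 2)  # row-major: 0 1 / 2 3
        frame[r * H // 2 : (r + 1) * H // 2,
              c * W // 2 : (c + 1) * W // 2] = tile
    return frame

def quad_crop(frame, i):
    """Crop tile i back out; a re-sizer would then zoom it up for the screen."""
    r, c = divmod(i, 2)
    return frame[r * H // 2 : (r + 1) * H // 2,
                 c * W // 2 : (c + 1) * W // 2]

tiles = [np.full((H // 2, W // 2), v, dtype=np.uint8) for v in (1, 2, 3, 4)]
quad = quad_pack(tiles)
crop = quad_crop(quad, 3)
print(crop.shape)  # one server port carries all four screen feeds
```

One output port carries all four pictures; each screen's processor just crops its own quarter, at the cost of each feed having a quarter of the pixels.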

The other system that used to be used to get two video signals into one was, I think, CamPlex or VidiPlex(?) - it sent one field of each source, so you got a 288- or 240-line feed of each source (the decoder field-doubled it back to 576 or 480 lines) with half the motion.
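That field-multiplex idea can be modelled roughly like this, using a 576-line system and grayscale arrays; the exact line assignments are an assumption, not the real device's spec:

```python
import numpy as np

LINES, SAMPLES = 576, 720  # 625-line system active picture

def field_mux(src_a, src_b):
    """Carry one field of source A on even lines and one field of B on odd lines."""
    frame = np.empty((LINES, SAMPLES), dtype=src_a.dtype)
    frame[0::2] = src_a[0::2]  # 288 lines of A
    frame[1::2] = src_b[0::2]  # 288 lines of B
    return frame

def field_demux(frame):
    """Recover each source by line-doubling its 288-line field back to 576 lines."""
    a = np.repeat(frame[0::2], 2, axis=0)
    b = np.repeat(frame[1::2], 2, axis=0)
    return a, b

src_a = np.full((LINES, SAMPLES), 50, dtype=np.uint8)
src_b = np.full((LINES, SAMPLES), 90, dtype=np.uint8)
mux = field_mux(src_a, src_b)
out_a, out_b = field_demux(mux)
print(mux.shape, out_a.shape, out_b.shape)
```

Each recovered picture has only 288 real lines and one field's worth of motion per frame, matching the half-resolution, half-motion behaviour described above.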
FraserGJ
This sounds like how GMB/This Morning must be doing it for their screens - I've seen photos taken in the GMB studio after it's gone off-air with the This Morning 'windows' showing like a patchwork on the screens - obviously the raw source you mention, before it's cut up and fed to each 'window'.
dosxuk
This is absolutely standard for media servers used for feeding content to screens. Rather than having one source device for every screen, you have one (plus backups!) that creates a feed that is distributed and sliced up. Many professional screens will do this slicing internally (mainly designed for easily creating video walls), but there are a range of devices to connect in the middle.

If you see an arena show with 50-odd LED screens, all playing different content, it's almost certainly just one output from a computer, rather than an entire room full of them.
Rkolsen
Here the only HD standards used are 1920x1080 and 1280x720, and as it’s an NBC station it will be 1920x1080 (Fox and ABC are the only 720p broadcasters). There used to be HD Lite on satellite, where channels were carried at 1440x1080, and some cable companies, in an effort to fit 100+ HD channels, downgraded signals to 720p (but they cannot convert or manipulate broadcast channels on their systems). The ATSC standard only supports those two resolutions for HD, though others have been used for distribution. If there are multiple HD services, a 1920x1080i signal may be downgraded to 1280x720p or widescreen SD, so those odd resolutions wouldn’t be transmittable.

Edit: However, if they are using a DVE or resizing for multiple boxes (not sure what it’s called when there is more than one video feed on air), it may not be noticeable.
noggin Founding member
Yep - it was Dish and/or DirecTV who used (or use) the lower horizontal resolution variants - as low as 1080x1080, I believe. People with modified receivers who could access the decrypted transport stream were able to analyse the video parameters, ISTR. Things may have changed since then. AIUI the satellite platforms are allowed to re-encode the local broadcast channels they distribute - they use H.264 rather than MPEG-2 (which is still used OTA) for distribution these days?

ATSC mandates (or kind of tried to) specific resolutions for OTA and cable - but the North American satellite operators can pretty much do what they like on their closed platforms.
Rkolsen
Satellite providers can re-encode to fit their distribution platform; however, they cannot change anything that would alter how it’s displayed - if it’s 1920x1080i, they have to transmit it at the same quality as the station. The same goes for cable, where until fairly recently you could plug in a TV without a box, tune to the station’s OTA channel number, and get the channel. Now that’s all encrypted. Like satellite, they can’t distort or change the signal.
noggin Founding member
They were certainly down-sampling horizontally in the MPEG-2 days: https://www.engadget.com/2007/01/11/the-engadget-hd-interview-directvs-cto-re-hd-lite/

HD Lite was the name christened by the people who didn't like it. I wasn't aware of any legislation in the US dictating that broadcast resolution has to be maintained 1:1.
Rkolsen
HD Lite was never used on broadcast OTA channels - I'm going by a comment from an employee in a Comcast forum. Also, my local Comcast headend (and most across the country) have moved to "enhanced HD", aka down-converting all non-broadcast channels to 1280x720p and encoding them in MPEG-4. They may have converted OTA channels to MPEG-4, but the resolution stands. I believe if Comcast and others could downsize broadcast affiliates to 720p, they would.
