
Yes - I think the same thing may have happened on the US show last year. Odd that this can happen these days with common font standards and standardised screen pixel sizes (1920x1080 is used by both 59.94i and 50i HD standards), though mistakes with pixel and display aspect ratios do still get made.


Can someone please explain 59.94 fps to me? I’ve never understood why it isn’t 60 fps. Was it ever 60? I thought it was based on the frequency of American electricity systems. But are they 60Hz, or are they actually 59.94Hz too?


In the days of NTSC black and white (yep - unlike PAL, NTSC doesn't just mean a colour standard) the US standard was 525 lines at precisely 60 fields per second (2:1 interlaced, so 30 frames per second), giving a line-rate of 15.75kHz. That all worked fine.
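A quick back-of-the-envelope check of that line-rate (a minimal sketch in Python, using just the figures above):

```python
# Original US black & white standard: 525 lines, 60 fields/s, 2:1 interlace
lines_per_frame = 525
fields_per_second = 60.0
frames_per_second = fields_per_second / 2        # 30 frames/s
line_rate = lines_per_frame * frames_per_second  # 525 * 30
print(line_rate)  # 15750.0 -> exactly 15.75 kHz
```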

However, when compatible NTSC colour was introduced in 1953 (*), the subcarrier that carried the colour information had to be locked to the line-rate to make things work nicely (and to improve compatibility by reducing the subcarrier's visibility on B&W sets etc.). At the existing 15.75kHz line-rate, the resulting subcarrier frequency threatened to beat against the 4.5MHz FM sound carrier that System M uses (potentially causing audible buzzing), so they needed a way to change the colour subcarrier frequency, which meant altering the line-rate.

As the number of lines was fixed at 525 (that really is baked into the TVs), the only way of altering the line-rate was to subtly change the field-frequency (and hence the frame-frequency). Doing so by a small amount reduced the chances of interference but kept full compatibility with existing B&W sets (a 0.06Hz difference was not an issue).
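For the numerically curious, here's how the new rates fall out. A minimal sketch, assuming the standard NTSC relationships (the 4.5MHz sound carrier was made the 286th harmonic of the new line-rate, and the colour subcarrier sits at 455/2 times the line-rate):

```python
sound_carrier = 4.5e6              # System M FM sound intercarrier, Hz
line_rate = sound_carrier / 286    # 15,734.27 Hz (down from 15,750 Hz)
field_rate = line_rate / 262.5     # 262.5 lines per field
subcarrier = line_rate * 455 / 2   # ~3.579545 MHz

print(field_rate)        # 59.9400599... Hz
print(60 * 1000 / 1001)  # 59.9400599... -> the familiar 1000/1001 factor
print(60 - field_rate)   # ~0.06 Hz below the old B&W field rate
```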

So since then all '60Hz' (**) broadcast TV has actually been 59.94Hz, and almost all film shot for TV (and plenty that isn't) is shot or transferred at 23.976fps (***) so that 3:2 pulldown telecine takes it to 59.94Hz.
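The 3:2 pulldown arithmetic is straightforward (a minimal sketch: frames are held for 3 fields, then 2, alternately, so each film frame occupies 2.5 fields on average):

```python
film_rate = 24 * 1000 / 1001   # 23.976... fps after the 1000/1001 slowdown
# 3:2 pulldown holds successive film frames for 3,2,3,2,... video fields,
# i.e. 4 film frames fill 10 fields, an average of 2.5 fields per frame.
field_rate = film_rate * 2.5
print(film_rate)    # 23.9760239...
print(field_rate)   # 59.9400599... -> matches the broadcast field rate
```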

The irony is that the interference probably wouldn't have been that noticeable in practice, and could likely have been mitigated over time. Because of it, we have 1000/1001 frame/field rates and drop-frame timecode...
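To show why drop-frame timecode exists at all, here's a rough sketch (not a production implementation): a nominal 30fps timecode count runs fast against the real 29.97fps signal by about 3.6 seconds an hour, and skipping frame numbers 00 and 01 at the start of every minute except each tenth minute drops 108 labels per hour, which almost exactly cancels the drift:

```python
real_rate = 30 * 1000 / 1001            # 29.97002997... fps actually broadcast
frames_per_real_hour = real_rate * 3600 # frames that really occur in an hour
nominal_labels = 30 * 3600              # 108,000 labels used by non-drop timecode
print(nominal_labels - frames_per_real_hour)  # ~107.9 -> timecode gains ~3.6 s/hour

# Drop-frame: skip labels :00 and :01 each minute, except minutes 0,10,20,30,40,50
dropped_per_hour = 2 * (60 - 6)
print(dropped_per_hour)                 # 108 -> almost exactly cancels the drift
```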

(*) There was an incompatible CBS 405-line colour system briefly introduced in the US, running at effectively 405/72fps and sending R, G and B fields sequentially to deliver a 24fps colour picture. Because it was incompatible, broadcasters had to simulcast in colour, as B&W receivers couldn't display the colour transmissions. It could use a B&W camera and a B&W CRT with spinning colour discs in front of them (much as some DLP projectors do today). It didn't last, and the RCA compatible system soon replaced it.

(**) The original Japanese HiVision 1125-line HD standard (from which today's 1080i is largely derived) WAS 60.00Hz initially, and remained so well into the 80s. Complex downconverters were needed to output a 525/59.94Hz signal from it (they tried to drop frames around cuts etc.). When the US adopted 1125 for HD they adopted a 59.94Hz version instead... There are hopes that 4320/120p will be adopted rather than 4320/119.88p...

(***) Some European film is shot and transferred at 24.000Hz as there is no 59.94Hz to worry about. Some European Blu-rays are thus 24.000p rather than 23.976p.

And just to add to noggin's explanation, here's a video about the 29.97 frame rate.