Why were/are TV series shot in 24fps?


This question is mainly about pre-2000 TV series created for CRT TVs, but a lot of it still applies to HD-TV content.

So TVs have been 60Hz/59.94Hz (NTSC) ever since their inception, and an old video camera would shoot 60 interlaced fields per second to match the refresh rate. However, due to the low visual quality of video, most higher-budget series were shot on film at 24fps and converted via 3:2 pulldown to match the 60Hz. Examples would be Star Trek or The X-Files.
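
To make that conversion concrete, here is a minimal Python sketch of the 3:2 pulldown cadence (the function name and frame labels are just for illustration): every four film frames become ten video fields, i.e. five interlaced NTSC frames.

```python
# Minimal sketch of the 3:2 pulldown cadence: every 4 film frames (A, B, C, D)
# become 10 video fields, i.e. 5 interlaced NTSC frames.

def pulldown_32(film_frames):
    """Expand film frames into (frame, field parity) pairs using a 2:3 cadence."""
    fields = []
    cadence = [2, 3]                     # fields contributed by alternating film frames
    for i, frame in enumerate(film_frames):
        for _ in range(cadence[i % 2]):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

fields = pulldown_32(["A", "B", "C", "D"])
print(len(fields))                       # 10 fields for 4 film frames
for video_frame in zip(fields[0::2], fields[1::2]):
    print(video_frame)
# Video frames come out as AA, BB, BC, CD, DD: two of the five interlaced frames
# mix fields from two different film frames, which is where the judder comes from.
```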

I can understand why they chose not to shoot film at the full 60fps, as that would require twice the amount of film stock and more light. However, what I don't understand is why they went with 24fps instead of 30fps.

A 30fps frame rate that can be displayed accurately on a 60Hz TV seems like a more obvious choice than going with 24fps and doing a 3:2 pulldown that introduces unnecessary judder.
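
For a rough sense of that judder, this small sketch (assuming the nominal 59.94Hz NTSC field rate) computes how long each film frame stays on screen under 3:2 pulldown compared with native 30fps material:

```python
# Rough look at pulldown judder, assuming the nominal NTSC field rate.

FIELD_RATE = 60000 / 1001                    # ~59.94 fields per second
field_ms = 1000 / FIELD_RATE                 # ~16.7 ms per field

for held in (2, 3):
    print(f"film frame held for {held} fields = {held * field_ms:.1f} ms on screen")
# Alternates between roughly 33 ms and 50 ms, so motion stutters slightly.

print(f"average: {2.5 * field_ms:.1f} ms per film frame (about 1/23.976 s)")
print(f"native 30 fps would always get {2 * field_ms:.1f} ms, hence no judder on a 60 Hz set")
```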

But as far as I know, no TV series was actually shot at 30fps. Cheap soap operas went for video at 60 fields per second, while all the ones that went to film used 24fps. Only recently, with the rise of digital cameras, YouTube and Internet streaming, has a substantial amount of 30fps content become available, though most/all(?) of the big-budget productions still seem to go for 24fps.

Why choose 24fps when the content is never going to be displayed in a cinema and most TVs still don't support native 24fps playback? Has there been any 30fps content on TV prior to the introduction of digital cameras and HD-TVs?



Best Answer

There are two related reasons.

The first is that feature film production, from the 1920s onward when film cameras became standardized, was designed entirely around working at 24fps. Cinematographers learned to exercise fine control over exposure, image quality and depth of field by balancing shutter angle (how long each frame is exposed, literally how wide the gap or gaps in the film camera's spinning shutter are), f-stop (the width of the iris inside the lens), film stock (faster but grainier vs. slower but cleaner) and filters (at a minimum, neutral density filters, which reduce light transmission to the film without altering color). There are some very complicated tables and a lot of rules of thumb built around balancing all these elements, and altering the frame rate throws still another problem into the mix. Meanwhile, shooting at 30fps cuts the amount of footage that fits in a film magazine by 20%, and bumps up the cost of film stock by 25%. Timing, editing and sound sync processes were all designed around 24fps as the desired output. This adds up to a lot of infrastructure and know-how optimised for 24fps, in a high-pressure production environment where time is money, and small mistakes can add up to even bigger money.
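
As a back-of-the-envelope illustration of those trade-offs, the sketch below assumes standard 35 mm 4-perf film (16 frames per foot), a 1000 ft magazine and a 180° shutter; these are common defaults, not figures taken from the answer itself.

```python
# Back-of-the-envelope numbers, assuming standard 35 mm 4-perf film (16 frames
# per foot), a 1000 ft magazine and a 180° shutter; these are common defaults,
# not figures taken from the answer above.

FRAMES_PER_FOOT = 16
MAGAZINE_FEET = 1000

def magazine_minutes(fps):
    """How long one magazine lasts at a given frame rate."""
    return MAGAZINE_FEET * FRAMES_PER_FOOT / fps / 60

def exposure_seconds(fps, shutter_angle=180):
    """Per-frame exposure time for a rotary shutter of the given angle."""
    return (shutter_angle / 360) / fps

t24, t30 = magazine_minutes(24), magazine_minutes(30)
print(f"24 fps: {t24:.1f} min per magazine, 30 fps: {t30:.1f} min")    # ~11.1 vs ~8.9
print(f"magazine runtime drops by {1 - t30 / t24:.0%} at 30 fps")      # 20%
print(f"film consumed per minute rises by {30 / 24 - 1:.0%}")          # 25%
print(f"exposure at 24 fps with a 180° shutter: 1/{1 / exposure_seconds(24):.0f} s")  # 1/48 s
```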

But it was (and is) certainly possible to shoot at higher frame rates. It's simply not worth it, for an additional reason. If the goal is to get a higher-quality image on TV, higher frame rates can actually work against you. I went into this in an answer about the "Soap Opera Effect," but the bottom line for this question is that the "judder" introduced by duplicating frames in the 3:2 pulldown isn't usually very noticeable, but the increased overall "sharpness" of motion at higher frame rates is.

And now we're getting into opinion, but I believe it's partly cultural, and partly about how our low-level visual system translates speed of motion into perception of safety or danger. 24fps feels more like a waking dream than a real situation. This isn't what we want for sports or video games, but it's exactly what we want to wrap our fiction in.







Why are TV shows filmed at 24fps?

In the silent film era, filmmakers shot movies at anywhere between 16 and 20fps, which is why the motion appears fast and jerky when those films are played back at modern projection speeds. Today, filmmakers typically shoot at a minimum of 24fps because this is widely considered the lowest frame rate at which motion still appears natural to the human eye.

Why is 24fps better than 30fps?

The choice of 24fps was largely a matter of budget and technological limitations: it allowed film companies to save money on film stock by shooting 24 frames per second instead of 30. The more 24fps was used in bigger-budget cinema films, the more we came to associate that frame rate with looking 'cinematic'.

Are TV shows shot in 30 fps?

The video look: the NTSC broadcast standard is roughly 30fps (29.97fps), so a lot of what you watch on TV, like news, sitcoms and reality shows, is shot at this frame rate.

Why do movies look fine at 24fps?

Perfectly fluid motion would take many more frames, but 24fps looks fine in footage shot on film because each frame is exposed over a stretch of time (typically 1/48s with a 180° shutter), so the resulting motion blur smooths out the movement between frames.
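
As a rough, made-up-numbers illustration of that motion-blur point, the sketch below assumes an object crossing a 1920-pixel-wide frame in two seconds, shot at 24fps with a 180° shutter:

```python
# Made-up numbers illustrating the motion-blur point: an object crossing a
# 1920 px wide frame in 2 seconds, shot at 24 fps with a 180° shutter.

frame_width_px = 1920
crossing_time_s = 2.0                      # assumed duration of the move
speed = frame_width_px / crossing_time_s   # 960 px per second

fps, shutter_angle = 24, 180
exposure_s = (shutter_angle / 360) / fps   # 1/48 s of light per frame
gap_s = 1 / fps - exposure_s               # 1/48 s the shutter is closed

print(f"blur recorded in each frame: {speed * exposure_s:.0f} px")   # ~20 px of smear
print(f"motion skipped between frames: {speed * gap_s:.0f} px")      # ~20 px, bridged by the blur
```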



A Defense of 24 FPS and Why It's Here to Stay for Cinema




More answers regarding "Why were/are TV series shot in 24fps?"

Answer 2

This is admittedly speculation, but one reason is probably that film cameras shooting at 30fps weren't readily available at the time. There's an entire industry set up for capturing and displaying film at 24fps. That of course still means spending money on conversion from 24p to 29.97i (or 25i for PAL), but the conversion has to happen anyway if you're shooting on film and delivering on video. So adding frame-rate conversion to a step everyone is already doing is cheaper than making a new camera.
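
For reference, the usual telecine numbers look roughly like this (standard practice, not something the answer itself spells out): NTSC slows the film slightly and applies 3:2 pulldown, while PAL typically speeds it up about 4% and shows each frame as two fields.

```python
# Standard telecine numbers for delivering 24 fps film on video; these are
# general practice, not details given in the answer itself.

FILM_FPS = 24
NTSC_FIELD_RATE = 60000 / 1001        # 59.94 fields per second
PAL_FIELD_RATE = 50

# NTSC: slow the film ~0.1% to 23.976 fps, then 3:2 pulldown to 59.94 fields/s.
ntsc_film_fps = NTSC_FIELD_RATE * 2 / 5
print(f"NTSC: film runs at {ntsc_film_fps:.3f} fps "
      f"({ntsc_film_fps / FILM_FPS - 1:+.2%} speed change)")

# PAL: speed the film up to 25 fps and show each frame as two fields (2:2 pulldown).
pal_film_fps = PAL_FIELD_RATE / 2
print(f"PAL: film runs at {pal_film_fps:.0f} fps "
      f"({pal_film_fps / FILM_FPS - 1:+.1%}); a 60-minute episode plays in "
      f"{60 * FILM_FPS / pal_film_fps:.1f} minutes")
```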

Plus, the conversion is probably done on the same hardware for many different shows, but they can't all use the same cameras at the same time. So the production companies would spend a lot more to buy new cameras for all of their productions, whereas they can buy fewer conversion machines and use them on all productions.

TL;DR - it's probably cheaper this way.

Answer 3

This question confuses three different things:

  • Filming using film cameras, at 24 frames per second
  • Filming using NTSC video cameras, which record at 59.94 fields per second, where odd-numbered fields contain the odd-numbered scan lines and even-numbered fields the even ones (see the sketch after this list)
  • 59.94 fields per second = 29.97 frames per second
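
As a toy illustration of how fields relate to frames (the helper names below are made up for the example), this sketch splits a "frame" of numbered scan lines into two fields and weaves them back together:

```python
# Toy illustration of interlacing: a "frame" is a list of numbered scan lines,
# each field carries every other line, and two fields weave back into one frame.
# (The helper names here are made up for the example.)

def split_fields(frame_lines):
    """Split a progressive frame into (odd, even) fields by scan line."""
    return frame_lines[0::2], frame_lines[1::2]   # 1-based odd lines, then even lines

def weave(odd_field, even_field):
    """Interleave two fields back into a full frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

lines = [f"line {n}" for n in range(1, 9)]        # an 8-line "frame" for illustration
odd, even = split_fields(lines)
print("odd field: ", odd)                         # lines 1, 3, 5, 7
print("even field:", even)                        # lines 2, 4, 6, 8
assert weave(odd, even) == lines                  # two fields reconstruct the frame

FIELD_RATE = 60000 / 1001
print(f"{FIELD_RATE:.2f} fields/s = {FIELD_RATE / 2:.2f} frames/s")   # 59.94 -> 29.97
```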

Most TV shows in the US and Canada from the 60s through the 90s were recorded on video at 29.97fps (citation needed!) because recording direct to NTSC video is cheaper: the media can be reused if necessary and no developing is required, and film stock was intrinsically more expensive than, say, DigiBeta tapes. (Note the problem in the UK that some early episodes of Doctor Who from the 1960s were lost because the tapes were wiped and reused.)

The benefit of using film is that it has much higher resolution than NTSC video, so shows shot on film can be rescanned for newer formats (which is why we can have great Blu-ray releases of movies from the 1930s but not of video-shot TV shows from the 1970s).

Sources: Stack Exchange - This article follows the attribution requirements of Stack Exchange and is licensed under CC BY-SA 3.0.
