You Need This Basic Knowledge Even for Casual Video
by Ivor Rackham (https://www.facebook.com/ivortog) · PetaPixel

Even if we are only shooting video occasionally, we need some information and one accessory that makes an enormous difference.
There is a fundamental difference between shooting video and stills. It’s not just that video is moving pictures. The difference is due to the need to balance frame rate and shutter speed. If you get it wrong, then the video looks bad.
A Lesson from Nature
We think that many creatures perceive time passing more slowly than we do. That is because they have much faster nervous systems. The key mechanism for this is called temporal resolution. It can be measured using the critical flicker fusion frequency (CFF). That is the rate at which a flickering light appears steady to the animal.
Human CFF sits between roughly 50 and 90 Hz; that is, we stop noticing flicker somewhere between 50 and 90 flashes per second. That is why some people can see the flicker of older fluorescent lights driven by the 50 Hz alternating current here in the UK. Meanwhile, flies have a CFF of around 300 Hz.
Of course, we cannot truly know how the world appears to a fly. But we can surmise that a higher CFF means the creature has a faster processing speed. Therefore, the world appears to move more slowly to them, which is why they can react much faster than we can with our lower CFF, and why they are good at avoiding being swatted.
Meanwhile, some creatures see the world pass by much faster because their CFF is even lower than ours. A lower CFF means slower processing. One of the slowest of all is the giant African land snail, which samples at only around 0.7 Hz. Everything therefore appears to the snail to move quickly, and we can assume that, as a result, we appear blurred to it when we move.
Although that works as an analogy, when recording movies, it isn’t quite the same as animals’ visual perception. Living senses are continuous, not composed of individually exposed frames stitched together, as in video.
Balancing the Two Settings
Importantly, with video, we have two settings to control how the video appears: firstly, the number of frames per second, and secondly, the shutter speed used to expose each frame. When recording video, we want the shutter speed to be about twice the frame rate.
Frames Per Second
If we shoot video at 60 frames per second (fps) and then play it back at 30 fps, everything in the frame will appear to move at half speed. Play that same footage back at 15 fps and it runs at one-quarter normal speed.
Conversely, if we were to record at 15 fps and play it back at 60 fps, everything would speed by four times faster than normal.
Therefore, to see the scene pass by at normal speed, the frame rate we record at should normally match the playback frame rate. If the two differ, the action slows down or speeds up accordingly; in editing, however, you can conform mismatched footage back to its original speed, as I have done in the video above.
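The playback arithmetic above can be captured in one line. A minimal sketch in Python (the function name `apparent_speed` is my own, not from any editing tool):

```python
def apparent_speed(record_fps: float, playback_fps: float) -> float:
    """Return how fast the action appears relative to real time.

    Footage shot at record_fps contains record_fps frames per real
    second; played back at playback_fps, that second of action takes
    record_fps / playback_fps seconds on screen.
    """
    return playback_fps / record_fps

# 60 fps footage played back at 30 fps runs at half speed.
print(apparent_speed(60, 30))   # 0.5
# 15 fps footage played back at 60 fps runs four times faster.
print(apparent_speed(15, 60))   # 4.0
```

A value below 1.0 is slow motion; above 1.0, a time-lapse effect.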
Shutter Speed
Shutter speed is not the same thing as the number of frames per second. However, it is the same as shutter speed in still photography: the length of time each frame is exposed. In theory, at 1/100th second you could fit up to 100 frames into a second; at a ½-second exposure, only two frames; and at a one-second exposure, just a single frame.
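That upper bound — the exposure time capping how many frames fit into a second — is simple arithmetic. A sketch, assuming we ignore sensor readout and other per-frame overheads (the helper name is hypothetical):

```python
def max_fps(shutter_seconds: float) -> float:
    """Upper bound on frames per second given the exposure time alone.

    Real cameras also need readout time between frames, so this is
    a theoretical ceiling, not an achievable rate.
    """
    return 1.0 / shutter_seconds

print(max_fps(1 / 100))  # at most 100 frames per second
print(max_fps(0.5))      # at most 2 frames per second
print(max_fps(1.0))      # at most 1 frame per second
```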
Choosing the Correct Shutter Speed
Let’s assume we are shooting at a shutter speed of one second. You probably know from still photography that most movement is blurred at a 1-second exposure. The same would apply to a video. Each frame would show blurred movement. It would not matter how fast you played back those frames. Yes, everything would appear to move faster, but that would not recover the detail lost in each frame. Movement in each picture would still be blurred.
Let us move to the opposite extreme and increase the shutter speed to 1/1000 second while still recording just a single frame every second. When we play it back, the movement will seem jerky, because 999/1000ths of each second is never recorded. Even at 30 or 60 fps, footage shot with that shutter speed will look jittery.
Clearly, there must be a happy medium between shutter speed and frame rate.
Balancing Frame Rate and Shutter Speed
Traditionally, cinema was filmed at 24 fps with the shutter speed set to double the frame rate, i.e., 1/48th second. That figure comes from the rotating 180° shutter used in film cameras: at 24 fps, the shutter exposed each frame for half the frame interval, which is 1/48th second, approximated on digital cameras as 1/50s. This level of motion blur feels natural to viewers because it matches what decades of cinema have conditioned us to expect.
Modern digital video still follows the same principle. Video is now commonly shot at around 60 fps; doubling that frame rate gives a shutter speed of 1/120s, which in practice means the nearest standard setting, 1/125s.
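The 180° rule reduces to halving the frame interval. A sketch of that calculation (rounding to the nearest standard camera setting, such as 1/50 or 1/125, is left to the shooter):

```python
def shutter_180(fps: float) -> float:
    """Exposure time under the 180-degree shutter rule.

    The shutter is open for half of each frame interval,
    i.e. 1 / (2 * fps) seconds.
    """
    return 1.0 / (2.0 * fps)

# 24 fps cinema -> 1/48 s, set as 1/50 on most digital cameras.
print(shutter_180(24))
# 60 fps video -> 1/120 s, set as 1/125.
print(shutter_180(60))
```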
Exploring Different Frame Rates
Most video media fall into a small set of standard frame rates. With some exceptions, cinema is still filmed at or very close to 24 fps. Peter Jackson’s The Hobbit trilogy was an exception: it was shot at 48 fps, known as High Frame Rate (HFR). Shooting and projecting at that higher rate reduced motion blur, strobing, and eye strain, especially in 3D. However, the visual difference was significant enough for viewers to notice, and some described the Hobbit films as looking too smooth, more like video than film.
TV in Europe is shot at 25 or 50 fps. In the USA, it is 29.97 or 59.94 fps. The lower two rates were tied to the electrical supply frequencies. Now, modern screens refresh much faster, and there is an expectation that action is clearer and sharper. In sports, it is far easier to track a ball on screen at higher frame rates.
Also, people are becoming used to the smoother action seen in games and want that clarity carried over to TV shows and movies. Most video games today run at anywhere between 60 and 240 fps, with some titles pushing even higher because fast reaction times are essential.
Can You See the Drawback?
One of the first things we learn as photographers is that to get a shallow depth of field, we want a wide aperture.
As I am typing this, it is bright outside. If I point my camera towards the outside world at f/2.8 with my camera’s base ISO of 200, the shutter speed is 1/4000. That is much faster than double the frame rate. As I mentioned above, if I am shooting at 60 fps, I want a shutter speed of just 1/125 second.
Moreover, when shooting to a log profile (put simply, one that compresses the highlights and lifts the shadows, and is designed for grading in editing), my camera’s base ISO rises to 400. Canon’s and Nikon’s log profiles use an even higher base ISO of 800, while Sony cameras go higher still.
At those ISOs, it is impossible to film with a wide aperture on a bright day at the required shutter speed. Therefore, one needs an ND filter to reduce the amount of light coming through the lens. Most people dabbling in video use a variable ND for that purpose, which is why a variable ND filter is the one accessory that is essential for video.
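How strong an ND filter is needed? Each stop of ND halves the light, which doubles the exposure time you can use, so the strength in stops is a base-2 logarithm of the shutter-speed ratio. A sketch using the numbers from this article (the function name is my own):

```python
import math

def nd_stops_needed(metered_shutter: float, target_shutter: float) -> float:
    """Stops of ND required to slow the shutter from metered to target.

    Each stop halves the light, doubling the usable exposure time,
    so the answer is log2(target / metered).
    """
    return math.log2(target_shutter / metered_shutter)

# Bright day: the meter says 1/4000 s, but 60 fps video wants ~1/125 s.
stops = nd_stops_needed(1 / 4000, 1 / 125)
print(round(stops))  # 5 stops, i.e. an ND32 (2**5 = 32) filter
```

Shooting log at a higher base ISO would add one or two stops on top of this.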
Conversely, at night, even if we increase the ISO, insufficient light may prevent us from achieving a shutter speed twice the frame rate.
A Short Video Demonstration
I have recorded a short video to demonstrate how a video’s look changes when the shutter speed ratio is altered. It starts with the shutter speed at double the frame rate. In the middle section, when the shutter speed is increased, look at the water flowing off the rocks and how the flow breaks up into distinct droplets. Finally, the last scene was recorded with the shutter speed matching the frame rate (the equivalent of a 360° shutter), and the water looks blurred.