I was reading a thread this morning on LinkedIn where a video editor was lamenting that DVDs and Blu-rays are rapidly dying, with nothing similar to take their place – except posting online and delivering media files on thumb drives. At which point, he asked the question: “What codecs deliver the best results for files placed on thumb drives?”
NOTE: I use the phrase “thumb drive” to mean a small, non-powered thingy that plugs into the USB port of a computer.
I’ve written a lot about codecs over the years because they are central to our ability to shoot, edit and distribute media. Just a few of these articles include:
But, our industry continues to evolve, so it’s time to revisit this topic. (Oh, and you’ll find the answer to this question in the Summary, at the end.)
A Codec (Compressor/Decompressor) is a mathematical algorithm that converts reality (sound and light) into binary numbers that can be stored in a computer; that’s the “Compressor” part. Then, it converts all those binary numbers back into images and sound that we can see and hear; that’s the “Decompressor” part.
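The compress/decompress round trip is easiest to see with a toy example. The run-length coder below is not a real media codec – just a minimal sketch of the idea: the encoder converts data into a more compact representation, and the decoder restores it exactly.

```python
def rle_encode(data):
    """Compress by collapsing runs of a repeated value into (value, count) pairs."""
    encoded = []
    for value in data:
        if encoded and encoded[-1][0] == value:
            encoded[-1][1] += 1
        else:
            encoded.append([value, 1])
    return encoded

def rle_decode(encoded):
    """Decompress by expanding each (value, count) pair back into a run."""
    return [value for value, count in encoded for _ in range(count)]

# A "scanline" with long runs of identical pixel values compresses well.
scanline = [255] * 10 + [0] * 5 + [128] * 3
packed = rle_encode(scanline)                 # 18 values become 3 pairs
assert rle_decode(packed) == scanline         # lossless: the round trip is exact
```

Real video codecs are vastly more sophisticated, but the contract is the same: compress on the way in, decompress on the way out.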
Codecs exist for still images, audio and video files. Popular still image codecs include:

- JPEG
- PNG
- TIFF
- Camera RAW formats
Popular audio codecs include:

- AAC
- MP3
- AIFF (uncompressed)
- WAV (uncompressed)
Popular video codecs include:

- H.264
- H.265 (HEVC)
- Apple ProRes
- Avid DNxHD
- GoPro CineForm
NOTE: Some formats, like QuickTime, MXF and MPEG-4, are not actual codecs, but containers that hold a variety of different codecs. For example, a QuickTime movie can hold a video file using the H.264 codec and an audio file using the AIF format.
In the past, I used to enjoy keeping track of how many audio and video codecs there were. But, frankly, I gave up when the number shot past 400. While the industry is not generating as many codecs as it was in the heyday of converting from SD to HD, hardly a month goes by without some manufacturer or developer announcing a new codec.
It is safe to say that there are “a lot!”
NOTE: As a sidelight, codecs are often divided into “lossy” and “lossless.” A lossless codec preserves all the original image quality so that when an image is restored it is indistinguishable from the original. RAW high-bit-depth files are examples of a mostly “lossless” codec. (I don’t think there is a perfectly lossless video codec; the files are just too big.)
A lossy codec “throws out” visual information as part of the compression process, which means that the compressed image does not have the original quality of the source. Virtually all video codecs are lossy to some degree, however the amount of loss varies by codec.
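A one-line sketch shows why lossy compression is irreversible: if the encoder quantizes values (here, rounding 8-bit pixel levels to a coarse step of 16 – an illustrative assumption, far cruder than any real codec), several different inputs map to the same stored value, so the decoder can never recover the originals.

```python
STEP = 16  # quantizer coarseness, chosen for illustration only

def quantize(level):
    # Many nearby input levels collapse onto the same stored level.
    return round(level / STEP) * STEP

original = [201, 203, 205, 207]            # four distinct 8-bit pixel values
stored = [quantize(v) for v in original]
print(stored)                              # [208, 208, 208, 208] -- the differences are gone
```

Once those four values become one, no decoder can tell them apart again. That is the trade a lossy codec makes: smaller files in exchange for discarded detail.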
ONE MORE CORE CONCEPT
This one is really important: All media, in order to be stored on a digital device, must use a codec. Even more important, all media stored on a digital device must be compressed.
Some video, like that shot on an iPhone, is significantly compressed. Other video, like that shot on a RED or high-end ARRI camera, is compressed, but not to the same degree.
The reason for all this compression is that video files are HUGE, and engineers are always looking for ways to make them smaller without sacrificing too much quality. For example, a single uncompressed 1080p frame at 10-bit 4:4:4 is roughly 8 MB. Playing a 30 fps sequence would require more than 230 MB PER SECOND, or over 800 GB per hour! As resolutions expand, these numbers only get worse.
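These numbers are easy to verify. Here is the arithmetic for one specific "uncompressed" case – 1080p at 10 bits per channel with no chroma subsampling; the frame dimensions and bit depth are assumptions chosen for illustration:

```python
WIDTH, HEIGHT = 1920, 1080
CHANNELS = 3           # R, G, B (or Y, Cb, Cr with no chroma subsampling)
BITS_PER_CHANNEL = 10
FPS = 30

bytes_per_frame = WIDTH * HEIGHT * CHANNELS * BITS_PER_CHANNEL / 8
mb_per_second = bytes_per_frame * FPS / 1e6
gb_per_hour = mb_per_second * 3600 / 1e3

print(f"{bytes_per_frame / 1e6:.1f} MB per frame")   # 7.8 MB per frame
print(f"{mb_per_second:.0f} MB per second")          # 233 MB per second
print(f"{gb_per_hour:.0f} GB per hour")              # 840 GB per hour
```

Scale the width and height up to 4K or 8K and the data rate quickly becomes unmanageable without compression.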
Without compression, we could not edit video on our computers.
The problem is that there is no free lunch. The perfect codec would provide “infinitely” high quality, “infinitely” small files with “infinitely” efficient editing performance.
Sigh… Ain’t gonna happen.
Instead, in the real world, we get to pick two, as the triangle above illustrates. We can choose codecs that create small files with reasonably high quality, but they are not efficient to edit, requiring serious computer horsepower. H.264 and H.265 are examples of this category.
Or, we can create very efficient files with great quality, but these files are not small. ProRes and DNxHD are examples here.
Or, we can create very small files that are efficient to edit, but the overall quality is poor. DV could be an example here.
There are four principal goals to consider when choosing a codec:

- Speed of compression
- Speed of decompression (playback)
- File size
- Image quality
For instance, if you are posting a file to the Internet, the size of the file and the speed of decompression are more important than how long it takes to compress the file in the first place or the quality of the final image. That is not to say these last two are unimportant, just less important.
On the other hand, if you are streaming a live event, the speed of compression is most important, because if you can’t compress faster than real-time, no one will be able to watch the event.
As a third example, for a network television program, the speed of decompression and the quality of the final image are of paramount importance.
In the past, back in the days of SD (standard-definition video), we would shoot, edit and distribute our media using a single codec. At the professional level, that codec would be DigiBetacam. Or, one level down, DV.
For all its problems with low resolution, interlacing and converting between three different frame rates, shooting standard-def video was a walk in the park compared to the mess we find ourselves in today.
What has evolved today is what I call the “Three Codec Workflow.” We shoot one codec, then transcode (convert) it to a second codec for editing, then transcode it to a third codec for distribution. That middle codec is called an “intermediate” or “mezzanine” codec that exists solely to provide a high-quality, very efficient video format that is optimized for editing.
For example, an iPhone shoots H.264 in an MPEG-4 container. That gets converted to ProRes 422 for editing in Final Cut Pro X, then uploaded to YouTube/Vimeo/Facebook as a high-bit-rate H.264 in a QuickTime container.
Or, an ARRI Alexa shoots using the ARRIRAW codec. This is transcoded to GoPro CineForm for editing in Premiere, then transcoded to a DCP for distribution to a digital cinema projector in a theater.
These are only two examples; there are dozens and dozens of variations. The point is that we need to use different codecs for different parts of the production and post-production process.
ADDING TO THE CONFUSION
In the past, we could count on codecs running on both Mac and Windows systems. Those days seem to be ending.
Apple has never made ProRes writeable (recordable) on Windows, though ProRes files play back easily there. Also, Apple recently announced that it will discontinue support for QuickTime on Windows.
This lack of QuickTime support is huge, because many video editing applications depend upon it; Adobe Premiere comes instantly to mind.
GoPro CineForm and DNxHD are both solid codec alternatives for Windows, but the future lack of QuickTime support does not, yet, have an immediate solution. I’m expecting to hear more about this at the annual NAB Show later this month.
MAKING THINGS STILL MORE COMPLEX
At one level, the move to higher resolutions, such as 4K and beyond, is really no different than shooting and editing HD – except that we need MUCH more storage, with much faster bandwidth.
But, the move to full support for Rec. 2020 (sometimes called HDR) is about more than resolution. It includes:

- Higher resolutions (UHD 4K and 8K)
- A much wider color gamut
- Greater dynamic range
And achieving these three goals requires video files that use 10-bit-depth codecs or greater throughout the entire production/post/distribution process. (Currently, the highest available bit depth is 16 bits per channel (color).)
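The practical difference bit depth makes is the number of gradations available per color channel, which is simply 2 raised to the bit depth:

```python
# Levels per channel double with each extra bit. 10-bit has 4x the
# gradations of 8-bit, which is why heavily graded 8-bit footage
# shows banding in smooth gradients (skies, vignettes).
for bits in (8, 10, 12, 16):
    print(f"{bits}-bit: {2 ** bits:,} levels per channel")
```

This prints 256 levels for 8-bit, 1,024 for 10-bit, 4,096 for 12-bit and 65,536 for 16-bit – which is why color grading has far more room to maneuver in higher bit depths.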
Shooting high bit-depth video is now commonplace. Most current cameras now offer a recording option that supports 10-bit video in some form.
Editing high bit-depth video is also straightforward. All ProRes formats support 10-bit video, with the two 4444 variants supporting 12-bit. Some, but not all, DNxHD formats support 10-bit video; the rest only use 8-bit. GoPro CineForm supports 10-bit, with the RGB variants supporting 12-bit. So, while we need to pick the right codec, editing in higher bit-depth ranges is possible.
The problem is that many – in fact, most – popular video codecs for distribution only support 8-bit images. This is one of the big benefits of the VERY-slow-to-roll-out H.265 codec: it supports 10-bit or greater video more easily than the current implementation of H.264.
With distribution, we need to carefully pick a codec that supports the bit-depth we need. And this varies by distribution outlet. There is no current standard codec that works for most situations.
A QUICK THOUGHT ON AUDIO CODECS
Fortunately, audio codecs are much easier to work with. AIF/AIFF and WAV files are two different containers for the same audio data. Working with either one during production and post is an excellent choice as both are considered high-quality, “uncompressed” formats.
Most video cameras and audio gear record 48 kHz sample rates at 16-bit depth. This is a good choice for recording on set.
High-end audio production will often work with 96 kHz sample rates, or higher, for the same reason that video producers shoot high-resolution 4K or 6K video and downsample to HD for editing: it gives them more data to work with when creating effects and mixes.
When compressing, AAC is a better choice than MP3. And sample rates of either 44.1 kHz or 48 kHz at 16-bit depth will provide audio quality that exceeds the range of normal human hearing.
NOTE: For distribution, I use 44.1 kHz for audio-only files and 48 kHz sample rates for audio which gets synced to video; both at 16-bit-depth.
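Uncompressed audio is tiny compared with video, which is why these “uncompressed” formats are practical all the way through post. The data rate is just sample rate × bit depth × number of channels:

```python
def audio_kbps(sample_rate, bit_depth, channels):
    """Uncompressed PCM data rate in kilobits per second."""
    return sample_rate * bit_depth * channels / 1000

# Standard on-set recording: 48 kHz, 16-bit, stereo
stereo_48k = audio_kbps(48_000, 16, 2)
print(f"{stereo_48k:.0f} kbps")   # 1536 kbps, about 0.19 MB per second
```

Compare that 0.19 MB/second with the hundreds of MB/second that uncompressed video demands, and it is clear why audio rarely needs lossy compression during production.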
Codecs are like the engine in a car: we need one to get anywhere, but most of us would rather think about something else. The problem is that, currently, NOT thinking about the codecs we are using can slow our editing, degrade our images or make our final edit undeliverable.
Making matters even more complex is that there is no single “best practices” codec. But, here are some thoughts.
And the answer to the thumb drive question at the top? Well, it depends. As long as you aren’t interested in supporting some form of HDR, compress the video using the H.264 codec, the audio as AAC, and store them both in an MPEG-4 container. Those formats will play on just about everything.
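When sizing files for a thumb drive, the arithmetic is straightforward: file size is bitrate times duration. The bitrates below are illustrative assumptions, not recommendations:

```python
def file_size_gb(video_mbps, audio_kbps, minutes):
    """Approximate file size in GB for a combined video + audio bitrate and duration."""
    total_bits = (video_mbps * 1e6 + audio_kbps * 1e3) * minutes * 60
    return total_bits / 8 / 1e9

# Example: a 60-minute program at 10 Mbps H.264 video plus 192 kbps AAC audio
print(f"{file_size_gb(10, 192, 60):.1f} GB")   # 4.6 GB
```

In other words, an hour-long program at a healthy H.264 bitrate fits comfortably on even a small thumb drive, with room to spare.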
NOTE: Here’s a video that explains basic compression concepts.
However, it will surprise no one when I say that this will probably change by next year. Sigh…
17 Responses to Understanding Codecs – And Why They Are Important
I finally got my answer directly from CineForm themselves. Since you do this for a living, Larry, this should help you too.
VillasManzanill said:
If the creators of GoPro say it’s better, I trust that they know what they are talking about.
CineForm does look better to me than ProRes.
Plus, the defish looks better than the plugin.
So that’s for sure what I want.
If anybody knows more, that would be great.
Thanks all for your suggestions.
I FINALLY GOT THE ANSWER
It is possible to do what I needed, but there is a bug in CineForm that they didn’t fix. Here is the email I got from CineForm explaining it:
It’s possible as long as I keep my resolution at HD 1920 x 1080.
“Okay, I see what is happening. My first render that worked was HD resolution. When I tried a UHD export, I also get the lines.
This is actually a bug that I reported two years ago when we discontinued the paid versions of GoPro Studio and chose to only offer the free version. At the time, the free version only supported HD. We had intended to remove the licensing restriction from the free version to allow it to encode higher than HD. That work was never done, and like I said, I reported it as a bug and tried following up on it many times… to no avail. I don’t know if you know, but the team responsible for GoPro Studio has been let go as GoPro is focusing more on Quik and mobile apps now, so no new development or bug fixes are going into GoPro Studio.
Part of the reason this work was never done has to do with Apple’s sandboxing of their products and their diminishing support for QuickTime. Apple restricts 3rd party codecs from being seen by FCP/Compressor without the use of QuickTime and they have publicly stated that QuickTime is being retired. So our engineers did not pursue fixing something in a legacy product.
What we have done, instead, is to make the codec available to 3rd parties as part of our CineForm SDK. Adobe has taken advantage of this and incorporated CineForm into their products and Blackmagic Design has been working on this as well. For you to be able to use CineForm in FCP or Compressor (in 4K) at this point, would require Apple to use our SDK and incorporate it themselves.
Sorry that this is probably not what you want to hear… it’s not what I wanted to have to say. But this is where we are at.
Let me know if you have any questions.
CineForm | GoPro”
Thank you for sharing this information! It was not something I was aware of.
I’m a little confused. I have a Canon 5D Mark III, an iMac, and use Premiere Pro. I don’t see where I have any options on codec. As far as my camera goes, I can’t set the codec there. I don’t see any options when I bring the footage into my computer. I guess the only place I see an option is when I export from Premiere Pro. In summary, I’m not sure any of this concerns me except the export.
The Canon 5D Mark III shoots a video codec called H.264. This highly compressed 8-bit codec is typical of many DSLR cameras. It requires a lot of horsepower for video editing, which Premiere supplies.
It is not ideal for color grading – being only 8-bit in depth – which means that if you are doing lots of effects, converting it to a 10-bit codec, such as GoPro CineForm, would make more sense.
Also, H.264 takes longer to export than other codecs, such as ProRes or GoPro CineForm.
Every video is recorded using a codec. Whether it makes sense to convert to a different codec depends upon the work you are doing, the effects you are applying, how important speed is to your workflow and what you are doing with the final output.