Handling Transparency in FFmpeg

Transparency in video is stored in the alpha channel. Most formats ignore it. FFmpeg reads and writes alpha if the codec and container support it. That means you must pick the right combination. For example, use -c:v qtrle with a .mov file, or -c:v png for image sequences. WebM with VP9 (-c:v libvpx-vp9 -pix_fmt yuva420p) supports alpha for web delivery.

Start by checking the source. Run:

ffmpeg -i input.mov 

Look for rgba or yuva420p in the pixel format. If you don’t see alpha, it’s gone. No filter can restore it.
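To automate that check, a minimal sketch in Python (the function name and format list are my own; extend the list for the formats your pipeline uses):

```python
ALPHA_FORMATS = ("rgba", "bgra", "argb", "abgr", "yuva420p",
                 "yuva422p", "yuva444p", "ya8", "gbrap")

def stream_line_has_alpha(line: str) -> bool:
    """Scan one 'Stream #0:0: Video: ...' line from ffmpeg -i output
    for an alpha-capable pixel format."""
    return any(fmt in line for fmt in ALPHA_FORMATS)
```

For example, a stream line reporting qtrle with rgba returns True, while an h264 line with yuv420p returns False.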

To keep transparency during processing, avoid filters that drop the alpha channel. Many filters ignore alpha unless told otherwise. The format and alphamerge filters control alpha explicitly. Example:

ffmpeg -i input.mov -vf "format=rgba" -c:v qtrle output.mov

This keeps the alpha as RGBA pixels.

Compositing is possible directly in FFmpeg. Overlay one video with alpha over another:

ffmpeg -i background.mp4 -i overlay.mov -filter_complex "[0][1]overlay" output.mp4

The composite here is flattened: H.264 in MP4 cannot carry alpha. To keep transparency in the output, encode to an alpha-capable codec and container instead, such as -c:v libvpx-vp9 -pix_fmt yuva420p with a .webm output.
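Building the command as an argument list instead of a shell string avoids quoting mistakes entirely. A minimal sketch in Python (the function name is my own, not an FFmpeg API):

```python
def build_overlay_cmd(background: str, overlay: str, output: str,
                      keep_alpha: bool = False) -> list:
    # argv list, not a shell string, so paths and filters need no quoting
    cmd = ["ffmpeg", "-y", "-i", background, "-i", overlay,
           "-filter_complex", "[0][1]overlay"]
    if keep_alpha:
        # alpha survives only with an alpha-capable codec/container (e.g. VP9/WebM)
        cmd += ["-c:v", "libvpx-vp9", "-pix_fmt", "yuva420p"]
    cmd.append(output)
    return cmd
```

Pass the result to subprocess.run(cmd, check=True) so a failed step raises instead of silently continuing.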

Performance matters. Encoding with alpha is slower because FFmpeg processes more data per frame, and the pixel format determines memory use: RGBA stores four full-resolution channels, while yuva420p adds a full-resolution alpha plane on top of subsampled chroma. Use hardware acceleration where available, but remember that standard H.264 does not carry an alpha channel, so H.264 pipelines flatten transparency.
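The memory difference is easy to quantify: bytes per frame is width × height × average bytes per pixel, roughly 1.5 for yuv420p, 2.5 for yuva420p (the alpha plane is full resolution), and 4 for rgba. A quick sketch, assuming 8-bit formats:

```python
BYTES_PER_PIXEL = {
    "yuv420p": 1.5,   # full-res Y, quarter-res U and V, no alpha
    "yuva420p": 2.5,  # adds a full-resolution alpha plane
    "rgba": 4.0,      # four full-resolution 8-bit channels
}

def frame_bytes(width: int, height: int, pix_fmt: str) -> int:
    """Approximate uncompressed size of one frame in bytes."""
    return int(width * height * BYTES_PER_PIXEL[pix_fmt])
```

At 1080p that works out to about 8.3 MB per rgba frame versus 5.2 MB for yuva420p.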

Automation makes this repeatable. Script the FFmpeg CLI directly or wrap it in Python using subprocess. Pass exact flags so you don't strip transparency by accident, and build pipelines that verify the pixel format after each step. Fail early to save time.
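One way to sketch that fail-early pipeline, assuming ffprobe is on the PATH (the function names and the format set are my own; extend the set for your formats):

```python
import json
import subprocess
import sys

ALPHA_FMTS = {"rgba", "bgra", "argb", "abgr", "ya8",
              "yuva420p", "yuva422p", "yuva444p", "gbrap"}

def probe_pix_fmt(path):
    """Return the first video stream's pixel format via ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=pix_fmt", "-of", "json", path],
        capture_output=True, text=True, check=True).stdout
    return json.loads(out)["streams"][0]["pix_fmt"]

def run_step(cmd, output):
    """Run one FFmpeg step, then fail fast if alpha was dropped."""
    subprocess.run(cmd, check=True)
    fmt = probe_pix_fmt(output)
    if fmt not in ALPHA_FMTS:
        sys.exit(f"{output}: alpha lost (pix_fmt={fmt})")
```

Chain run_step calls for each stage of the pipeline; the first stage that flattens alpha stops the run instead of wasting encodes downstream.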

Transparency processing is precise work. Every codec, every flag, every filter matters. FFmpeg will do exactly what you tell it—no more, no less.

Test your workflow live on hoop.dev. Push your FFmpeg pipeline to production in minutes and see transparency handled end-to-end without guesswork.
