NOTE: This was originally maintained in TI E2E wiki here. It was written primarily for Windows users. Some data might be out of date - corrections are welcome.
- For Ubuntu:
sudo apt-get install mpv ffmpeg mplayer mkvtoolnix-gui mkvtoolnix x264 x265 lame
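- To confirm what got installed, the usual version flags should work, e.g.:
ffmpeg -version
mpv --version
x264 --version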
- If you need nightly builds, check the following PPAs:
- MPV, MPlayer and FFMpeg for Windows:
- Builds by Sherpya
- FFMpeg-only builds by Zeranoe
- MPV is a newer fork of MPlayer and supports most of MPlayer's commands
- MPlayer codecs:
- Codecs Directory
- Latest Windows package: windows-all-20071007.zip
- Installation: Extract the files to
<MPLAYER_INSTALL_DIR\codecs\>
- MP4Box: https://github.com/gpac/gpac
- Deprecated GUI options (current state unknown):
- YAMB
- MY-MP4BOX-GUI
- x264 - http://x264.nl/
- x265 - https://en.wikipedia.org/wiki/X265
- K-Lite Mega Codec Pack for Windows:
- K-Lite Download Page
- Download and install the MEGA pack.
- Select the "Lots of stuff with player" profile (not tested recently)
- Enable the DXVA option for both H.264 and VC-1 during installation (not tested recently)
- AVISynth+
- Active Github Repo
- Plugins and samples
- Documentation
- Home page
- Deprecated Linux Port
- Plugin to load encoded videos via FFMpeg
- Older deprecated version: AVISynth
- Vapoursynth
- Github
- Documentation
- Similar to AviSynth/AviSynth+ but Python-based. Windows and Linux support is available
- MKV utils:
- TS Muxer
- LAME
- Codecs
- Containers
- Pixel formats:
- To get a list of these, try:
ffmpeg -codecs
ffmpeg -formats
ffmpeg -pix_fmts
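- Each list is long, so filtering the output helps find a specific entry, e.g. (use findstr instead of grep on Windows):
ffmpeg -codecs | grep -i hevc
ffmpeg -pix_fmts | grep -i nv12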
- MPlayer is a media player.
- Supports almost all audio/video codecs/containers in the market.
- Can also be used to play RAW YUVs and elementary codec streams.
- Provides several audio filters (eg: karaoke, surround, scaletempo)
- Provides several video filters (eg: smartblur, rotate, crop)
- MPlayer Manpage
- A few MPlayer commands:
- Basic playback:
mplayer <filename>
- Play with a forced frame rate (useful when timestamps are missing or wrong):
mplayer -fps 30 -nocorrect-pts <filename>
- Resize during playback:
mplayer -x <NewWidth> -y <NewHeight> <filename>
- De-interlace (linear interpolation):
mplayer <filename> -vf pp=li
- De-interlace (cubic interpolation):
mplayer <filename> -vf pp=ci
- Other post-processing filters:
mplayer -pphelp
- Loop playback (0 = infinite):
mplayer <filename> -loop <loopCount>
- Start from an offset:
mplayer -ss <offset in seconds or HH:MM:SS> <input>
- Dumping elementary streams:
mplayer <filename> -dumpvideo -dumpfile <videoStream.dump>
mplayer <filename> -dumpaudio -dumpfile <audioStream.dump>
- Playing RAW YUVs:
mplayer <input_320x240_i420.yuv> -demuxer rawvideo -rawvideo w=320:h=240:fps=30:format=i420
- To see supported formats use "format=help"
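- Audio filters (e.g. scaletempo, mentioned above) can be tried the same way; a small sketch, not verified recently:
mplayer <filename> -af scaletempo -speed 1.5
mplayer <filename> -af karaoke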
- Video filters:
- Get a list of available filters:
mplayer -vf help
- Example: using the crop filter. You can follow similar steps to use other filters.
- Get help:
mplayer -vf crop=help
- Crop the center 640x360 region of the original 1280x720 video:
mplayer input_1280x720.avi -vf crop=w=640:h=360
- Crop the top-right 640x360 region of the original 1280x720 video:
mplayer input_1280x720.avi -vf crop=w=640:h=360:x=640:y=0
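- Filters can be chained with commas; for example, cropping and then downscaling (assuming a 1280x720 input):
mplayer input_1280x720.avi -vf crop=640:360,scale=320:180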
- Getting screenshots:
mplayer <inputFile> -vf screenshot
- Hit "s" during playback to take a screenshot.
mplayer <inputFile> -vo png
- This will dump a .png for every frame. See mplayer -vo help for other supported output formats.
- Controlling window location and output modes:
- Specifying window location (x,y) in pixels:
mplayer <inputFile> -geometry <x>:<y>
- Playing multiple videos simultaneously on Windows (not tested recently):
mplayer <inputFile> -vo direct3d
- If the default vo value of directx is used, one of the videos will show only green frames.
- MPV is a fork of MPlayer, so it supports most of its commands, albeit with some minor tweaks.
- MPV manpage
- A few MPV commands:
mpv <filename>
mpv -fps 30 --no-correct-pts <filename>
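- MPV uses --long-option syntax; for example, seeking to an offset and looping (run mpv --list-options if these differ in your build):
mpv --start=00:01:30 <filename>
mpv --loop-file=inf <filename>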
- FFMpeg is primarily a transcoder.
- FFMpeg libraries are used in MPlayer and MPV
- Documentation
- List of supported containers:
ffmpeg -formats
- List of supported codecs:
ffmpeg -codecs
- List of supported YUV formats:
ffmpeg -pix_fmts
- Typical usage:
ffmpeg <inputInfo> -i <inputFile> <encodeOptions> <outputFile>
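- For instance, a minimal transcode to H.264 while copying the audio track (all filenames here are placeholders):
ffmpeg -i input.avi -vcodec libx264 -vb 1000000 -acodec copy output.mkv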
- Controlling number of frames to process:
ffmpeg -vframes 100 <inputInfo> -i <inputFile> <encodeOptions> <outputFile>
- Get codec information for a file:
ffmpeg -i <inputFile>
- Extracting YUV:
ffmpeg -i <inputFile> -vcodec rawvideo <output.yuv>
- Resizing YUV:
ffmpeg -s 320x240 -i <input_320x240.yuv> -s 640x480 <output_640x480.yuv>
- Changing YUV formats and resize in same command:
ffmpeg -s 320x240 -pix_fmt nv12 -i <input_320x240_nv12.yuv> -s 320x240 -pix_fmt yuv420p <output_320x240_yuv420p.yuv>
- Giving start offsets while processing:
ffmpeg -ss <offset in seconds or HH:MM:SS> -i <input> -vcodec copy <output>
- Extracting elementary streams:
- Identify required elementary stream format from supported formats using
ffmpeg -formats
- To extract:
ffmpeg -i <input> -vcodec copy -f <format> <output.format>
- Example 1: Extracting elementary H.264 from an AVI (ffmpeg -formats gives "format = h264"):
ffmpeg -i input.avi -vcodec copy -vbsf h264_mp4toannexb -f h264 output.h264
- Example 2: Extracting elementary MPEG4 from an AVI (ffmpeg -formats gives "format = m4v"):
ffmpeg -i input.avi -vcodec copy -f m4v output.m4v
- Example 3: Get bmp/png/jpg for frames:
ffmpeg -ss 00:10:00 -i <inputFile> -vframes 10 output%d.bmp
- This will dump bitmaps for ten frames from the timestamp 00:10:00 into files named output1.bmp, output2.bmp and so on.
- You can get .png, .jpg, .gif similarly.
- Note that lossy codecs like JPEG involve re-encoding and some loss in quality. You can add "-qmax 10" or another appropriate value to control the quality.
- Filters:
- The following applies to all codecs/YUVs/elementary streams/containers, as long as the resulting data is supported by the chosen output format.
- Video Cropping:
- Older versions:
ffmpeg -i input_320x240.avi -cropright 10 -cropleft 10 -croptop 10 -cropbottom 10 output_300x220.avi
- Newer versions:
ffmpeg -i input_320x240.avi -vf crop=<w>:<h>:<x>:<y> output_300x220.avi
- w = output width, h = output height, x = X coordinate of the output image within the input image, y = Y coordinate of the output image within the input image
- Video Padding:
- Older versions:
ffmpeg -i input_320x240.avi -padcolor <RRGGBB> -padright 10 -padleft 10 -padtop 10 -padbottom 10 output_340x260.avi
- <RRGGBB[AA]> is in hex.
- Newer versions:
ffmpeg -i input_320x240.avi -vf pad=<w>:<h>:<x>:<y>:<c> output_340x260.avi
- w = output width, h = output height, x = X coordinate of the input image within the output image, y = Y coordinate of the input image within the output image, c = padding color in <RRGGBB[AA]> format
- Decoding compressed streams:
- FFMPEG:
ffmpeg -i <inputFile> -vcodec rawvideo <output.yuv>
- MPlayer:
mplayer <inputFile> -vo yuv4mpeg
- This generates "stream.yuv", which is raw data wrapped in a YUV4MPEG header (the format used by the MJPEG tools).
- To convert this into raw data only, use ffmpeg as mentioned above.
- Playing RAW YUVs:
mplayer <input_320x240_i420.yuv> -demuxer rawvideo -rawvideo w=320:h=240:fps=30:format=i420
- Resizing YUV:
ffmpeg -s 320x240 -i <input_320x240.yuv> -s 640x480 <output_640x480.yuv>
- Changing YUV formats:
ffmpeg -vframes 100 -s 320x240 -pix_fmt nv12 -i <input_320x240_nv12.yuv> -s 320x240 -pix_fmt yuv420p <output_320x240_yuv420p.yuv>
- (vframes limits the number of frames processed)
- Giving start offsets while processing:
ffmpeg -r 1 -ss 10 -i input.yuv -vcodec copy -vframes 15 output.yuv
- In the above example, since fps=1 and offset=10, processing starts from the 10th frame and extracts 15 frames.
- Interlaced YUVs
- Converting field-interleaved to field-separate format:
mplayer input.yuv -demuxer rawvideo -rawvideo w=352:h=288 -vf tfields=0 -vo yuv4mpeg -fps 1000000
- Note that the stream.yuv generated here contains a header. Use ffmpeg -i stream.yuv output.yuv to convert it to RAW YUV without headers.
- 100 frames of 352x288 will give 200 frames of 352x144.
- Merging field-separate to field-interleaved format:
mplayer input.yuv -demuxer rawvideo -rawvideo w=352:h=144 -vf tinterlace -vo yuv4mpeg -fps 1000000
ffmpeg -i stream.yuv output.yuv
- 200 frames of 352x144 will give 100 frames of 352x288.
- Extracting individual fields using MPlayer:
mplayer input.yuv -demuxer rawvideo -rawvideo w=352:h=288 -vf field=<n> -vo yuv4mpeg -fps 1000000
- An even value of n gives even fields; an odd value gives odd fields.
ffmpeg -i stream.yuv output.yuv
- 100 frames of 352x288 will give 100 frames of 352x144.
- Swapping top and bottom fields:
mplayer <inputFile> -vf il=s
- Overlay frame numbers on YUV
- If you are on Windows, use the ShowFrameNumber() filter available in AviSynth.
- If AviSynth is not available, create test.srt in the MicroDVD subtitle format.
- Sample line in test.srt: {1}{1}1. This means the subtitle for frame 1 is "1".
- Sample line in test.srt: {1}{4}1. This means the subtitle for frames 1 to 4 is "1".
- Create test.srt with entries for a sufficient number of frames.
mplayer input352x288.yuv -demuxer rawvideo -rawvideo w=352:h=288:fps=1 -fps 100000 -sub test.srt -subdelay 0 -vo yuv4mpeg
ffmpeg -y -i stream.yuv output352x288_withFrameNumOverlay.yuv
- The first couple of frame numbers do not get overlaid. Send me a solution if you have one.
- The fps and subdelay options seemed to reduce the number of frames not getting overlaid at the beginning. Not really sure why.
- Tools and supported formats:
- FFMPEG
- YUV formats:
ffmpeg -pix_fmts
- Codecs:
ffmpeg -codecs
- Containers:
ffmpeg -formats
- MPlayer
- Available RAW/compressed video codecs:
mplayer -vc help
- MP4Box
- RAW formats and containers:
mp4box -h format
- Extracting elementary streams:
- Suggested tools for different formats:
- .avi with any codec: FFMpeg
- .mov, .3gp , .mp4 with MPEG4/H.264: MP4Box
- For FFMpeg, use -vbsf h264_mp4toannexb -f h264 -vcodec copy to dump an elementary stream with start codes.
- In case the above suggestions don't work, try other tools. eg: for .3gp with H.264, FFMpeg does not dump start codes, but MP4Box dumps a proper elementary stream.
- Another workaround is to convert the input into an AVI using FFMpeg and then extract the elementary stream from that AVI using FFMpeg.
- FFMpeg
- Identify required elementary stream format from supported formats using
ffmpeg -formats
- To extract:
ffmpeg -i <input> -vcodec copy -f <format> <output.format>
- Example 1: Extracting elementary H.264 from an AVI
ffmpeg -i input.avi -vcodec copy -vbsf h264_mp4toannexb -f h264 output.h264
- Example 2: Extracting elementary MPEG4 from an AVI
ffmpeg -i input.avi -vcodec copy -f m4v output.m4v
- MPlayer:
mplayer <filename> -dumpvideo -dumpfile <videoStream.dump>
mplayer <filename> -dumpaudio -dumpfile <audioStream.dump>
- MP4Box:
- Get track-IDs to dump using
mp4box -info input.mp4
- Dump required track using
mp4box -raw <TrackId> input.mp4
- Creating AVI/MP4/other container format files from elementary stream
- FFMpeg
ffmpeg -i <elementaryStream> -vcodec copy output.avi
ffmpeg -i <elementaryStream> -vcodec copy output.mp4
- MP4Box
mp4box -add input.m4v output.mp4
mp4box -add input.m4v input.aac output.mp4
- Remember to delete the output file before running the command. MP4Box appends to existing files by default.
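- For raw elementary streams without timing information, the frame rate can usually be supplied as an import option, e.g.:
mp4box -add input.m4v:fps=30 output.mp4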
- FOURCC (FOUR Character Code) is used to identify a video codec inside an AVI container.
- fourcc.org
- Wikipedia
- Changing FOURCC:
ffmpeg -i <inputFile> -vcodec copy -acodec copy -vtag <XXXX> output_FOURCC_XXXX.avi
- <XXXX> should be the required FOURCC code in the AVI.
- Modify FOURCC tags to force the PC to use different decoders.
- Example 1:
- In case of MPEG4 stream, FOURCC can be XVID or DIVX. Default is FMP4.
- If FOURCC=XVID, and you have an XVID decoder installed, and you try to play the avi in Windows Media Player, the XVID decoder is used.
- If FOURCC=DIVX, and you have a DIVX decoder installed, and you try to play the avi in Windows Media Player, the DIVX decoder is used.
- If FOURCC=FMP4, and you have installed K-Lite Codec Pack, and you try to play the avi in Windows Media Player, ffdshow decoder is used.
- GUI alternative: Nic's Mini AviC FourCC Changer - comes with K-Lite Codec Pack (Not tested recently)
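- For example, to retag an MPEG4 AVI so that an XVID decoder is used (a concrete instance of the -vtag command above):
ffmpeg -i input.avi -vcodec copy -acodec copy -vtag XVID output_FOURCC_XVID.avi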
- https://trac.ffmpeg.org/wiki/Encode/H.264
- https://trac.ffmpeg.org/wiki/Encode/H.265
- Maybe outdated (Use with caution)
- Specifying bitrates for encoding
- Example: To encode at 1 Mbps (million bits per second) with a deviation of up to 0.5 Mbps:
ffmpeg -i <input> -vb 1000000 -bt 500000 <output>
- Specifying codecs
- Default video codec: MPEG4.
- Default audio codec: MP2.
- Example: To encode using H.264 and AC-3:
ffmpeg -i <input> -vcodec libx264 -acodec ac3 <output>
- Faster encoding speeds
- Example: Typically, keep the number of threads equal to the number of CPU cores you have.
ffmpeg -i <input> -threads 8 <output>
- Single Pass Encoding
ffmpeg -i <input> -vcodec libx264 -acodec ac3 -fpre <PathToPresetFile> <output>
- Two Pass Encoding
- 1st Pass:
ffmpeg -pass 1 -i <input> -vcodec libx264 -acodec ac3 -fpre <PathToFirstPassPresetFile> <output>
- 2nd Pass:
ffmpeg -pass 2 -i <input> -vcodec libx264 -acodec ac3 -fpre <PathToNormalPresetFile> <output>
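- Newer FFMpeg/x264 builds replace preset files with -preset and -crf; a rough single-pass equivalent would be:
ffmpeg -i <input> -vcodec libx264 -preset medium -crf 23 -acodec copy <output>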
- MKV stands for Matroska Video. Other related extensions are .mka and .mks for audio and subtitles. This is a multimedia container format just like .AVI or .MP4.
- Documentation
- Useful features include adding subtitle tracks and splitting files into multiple parts without re-encoding.
- Download the latest version of TsMuxer with GUI.
- Supported formats: .TS, .M2TS, Blu-ray, AVCHD. Demuxing is also supported.
- Most input file formats (avi, mkv, etc...) are supported.
- Note that there is some issue with very long file names. If your file is not detected, try reducing the length of the file path.
- https://trac.ffmpeg.org/wiki/Encode/MP3
- https://trac.ffmpeg.org/wiki/Encode/AAC
- Removing audio in a file
ffmpeg -i <inputFile> -an -vcodec copy <outputFile>
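- Conversely, to keep only the audio track without re-encoding it (pick an output extension that matches the audio codec):
ffmpeg -i <inputFile> -vn -acodec copy <outputFile>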
- PS3 Media Server
- Not maintained since 2016
- This creates a media server on your PC, which must be on the same network as your PS3.
- This effectively removes the 4 GB file-size limitation (due to FAT32).
- All common containers (avi, mp4, flv, mkv) can be played without re-encoding; the application will re-mux the stream into a compatible container.
- If a file is not supported by the PS3, a transcode option will stream it in PS3-compatible formats.
- If you still want to re-encode yourself: PS3 H.264 Conversion Guide
- Some known limitations of PS3 (when playing without the media server):
- H.264 is supported only up to Level 4.1
- .mp4 files generated by FFMPEG are not playable directly. Use MP4Box to generate the .mp4 files.
- Always use AviSynth+ instead of AviSynth to avoid 32-bit constraints.
- AviSynth is a powerful tool for video post-production. It provides ways of editing and processing videos. AviSynth works as a frameserver, providing instant editing without the need for temporary files.
- AVISynth handles only RAW picture data. However, you can use compressed videos as source in an .avs script using certain filters.
- AVISynth in itself has no GUI and provides only a DLL that other applications can call. After installing AviSynth, you can play .avs scripts in MPlayer/VLC/any other media player. You can also use the script as an input to FFMpeg.
- AVISynth filters
- A filter here refers to a processing functionality provided by AVISynth using its Internal filters or External filters/plugins.
- To install external filters, just extract the provided DLLs into the plugins directory under AVISYNTH's install directory.
- Note that avisynth.dll is kept in the C:\WINDOWS\SYSTEM32\ directory, but plugins are placed in <AVISYNTH_INSTALL_DIR>\plugins.
- Sample script (Separating odd and even fields and displaying separately):
- Create a sample.avs file with the following lines. Change F:\sample.avi to a valid filename.
V=DirectShowSource("F:\sample.avi")
V=AssumeFrameBased(V)
V=AssumeTFF(V)
VS=SeparateFields(V)
VTop=SelectEven(VS)
VBot=SelectOdd(VS)
stackvertical(VTop,VBot)
- You can play this file in MPlayer using the command mplayer sample.avs.
- The output shows the top fields of each frame in the upper half and the bottom fields in the lower half, each at half the original height.
- The AssumeTFF filter indicates that the top field is to be displayed first.
- You can encode this output using FFMpeg with the command ffmpeg -i sample.avs output.avi. You might need to add other options to control the codec/bitrate of output.avi.
- Sample script (Displaying two videos side by side):
- Create a sample.avs file with the following lines. Change F:\sample1.avi and F:\sample2.avi to valid filenames.
V1=DirectShowSource("F:\sample1.avi")
V2=DirectShowSource("F:\sample2.avi")
stackhorizontal(V1,V2)
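- Sample script (overlaying frame numbers, an alternative to the MPlayer subtitle workaround described earlier; F:\sample.avi is a placeholder):
V=DirectShowSource("F:\sample.avi")
ShowFrameNumber(V)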
- Commonly useful filters:
- RawSource - read raw data
- DirectShowSource - read compressed data
- FFmpegSource2 - read compressed data (alternative)
- ShowSMPTE, ShowFrameNumber - show timestamps / frame numbers
- AssumeFrameBased, AssumeFieldBased - treat data as frames/fields
- SelectOdd, SelectEven - select fields and process them separately
- SeparateFields, Weave - de-interleave/interleave fields
- StackVertical, StackHorizontal - create mosaics
- Resize, AddBorders, Crop, LetterBox - resize, pad, crop and letterbox