ffmpeg - Processing Media
A couple of ffmpeg recipes.
The following snippet uses a PHP script (executed at the command line) to download an HDS manifest and assemble the video fragments:
- Edit C:\php\php.ini and uncomment the extension=php_curl.dll line
- php.exe AdobeHDS.php --manifest "http://adaptiv.wdr.de/...mp4.csmil/manifest.f4m?g=...&hdcore=3.10.0&plugin=aasp-3.10.0.29.28" --delete
- Use ffmpeg to convert the resulting FLV to MP4:
ffmpeg -i 1.flv -vcodec copy -acodec copy -map_metadata 0 1.mp4
ffmpeg \
-i https://...akamaihd.net/.../name/a.mp4/index.m3u8 \
-c copy -bsf:a aac_adtstoasc "foo.mp4"
rtmpdump \
--protocol 0 \
--host cp45414.edgefcs.net \
-a "ondemand?auth=daEa9dhbhaJd4dmc8bicPd1cJdcdzcUcwcd-btFUIl-bWG-CqsEHnBqLEpGnxK&aifp=v001&slist=public/mps_h264_med/public/news/world/1078000/1078809_h264_800k.mp4;public/mps_h264_lo/public/news/world/1078000/1078809_h264_496k.mp4;public/mps_h264_hi/public/news/world/1078000/1078809_h264_1500k.mp4" \
-y "mp4:public/mps_h264_lo/public/news/world/1078000/1078809_h264_496k.mp4" \
-o someresolution.flv
ffmpeg -i someresolution.flv -c:v copy -c:a copy someresolution.mp4
rtmpdump --protocol 0 --host cp45414.edgefcs.net \
-a "ondemand?auth=daEa9dhbhaJd4dmc8bicPd1cJdcdzcUcwcd-btFUIl-bWG-CqsEHnBqLEpGnxK&aifp=v001&slist=public/mps_h264_hi/public/news/world/1078000/1078809_h264_1500k.mp4" \
-y "mp4:public/mps_h264_hi/public/news/world/1078000/1078809_h264_1500k.mp4" \
-o 1078809_h264_1500k.flv
ffmpeg -i 1078809_h264_1500k.flv -c:v copy -c:a copy 1078809_h264_1500k.mp4
#!/bin/bash
youtube-dl https://www.youtube.com/watch?v=bY73vFGhSVk
# Trim time and crop sub-part and save as mp4
ffmpeg \
-i "Zootopia Official US Sloth Trailer-bY73vFGhSVk.mp4" \
-ss 00:01:49 -t 00:00:11.3 \
-vf "crop=480:320:600:100" \
-c:v libx264 \
-c:a aac \
-strict experimental \
-b:a 128k \
"laughing sloth.mp4"
# generate color palette
ffmpeg \
-i "laughing sloth.mp4" \
-y \
-vf fps=10,scale=320:-1:flags=lanczos,palettegen palette.png
# Render GIF using palette
ffmpeg \
-i "laughing sloth.mp4" \
-i palette.png \
-filter_complex "fps=10,scale=320:-1:flags=lanczos[x];[x][1:v]paletteuse" \
output.gif
convert.ps1
dir *.webm | foreach { ffmpeg -i $_.Name -ab 192k $_.Name.Replace("WEBM", "mp3").Replace("webm", "mp3") }
dir *.mkv | foreach { ffmpeg -i $_.Name -ab 192k $_.Name.Replace("mkv", "mp3") }
dir *.mkv | foreach { ffmpeg -i $_.Name -vcodec copy -acodec copy -map_metadata 0 $_.Name.Replace("mkv", "mp4") }
dir *.mkv | foreach { ffmpeg -i $_.Name -vcodec copy -ab 192k -map_metadata 0 $_.Name.Replace("mkv", "mp4") }
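Note that the string Replace above also rewrites any other occurrence of "mkv" inside a file name. On a POSIX shell the same batch remux can be sketched with suffix removal, which only touches the extension (a sketch; mp4_name is just an illustrative helper):

```shell
# Map an .mkv name to the .mp4 output name by stripping only the trailing
# ".mkv" suffix (other "mkv" substrings in the name stay untouched),
# then remux each file via stream copy.
mp4_name() { printf '%s.mp4\n' "${1%.mkv}"; }

for f in *.mkv; do
  [ -e "$f" ] || continue   # skip when the glob matches nothing
  ffmpeg -i "$f" -vcodec copy -acodec copy -map_metadata 0 "$(mp4_name "$f")"
done
```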
ffmpeg -list_devices true -f dshow -i dummy
On my work laptop, I have an integrated webcam, a built-in microphone, and an additional headset:
C:\Users\chgeuer>ffmpeg -list_devices true -f dshow -i dummy
ffmpeg version N-69972-g6c91afe Copyright (c) 2000-2015 the FFmpeg developers
...
[dshow @ 0000000004d2d540] DirectShow video devices (some may be both video and audio devices)
[dshow @ 0000000004d2d540] "Integrated Camera"
[dshow @ 0000000004d2d540] DirectShow audio devices
[dshow @ 0000000004d2d540] "Microphone (Realtek High Definition Audio)"
[dshow @ 0000000004d2d540] "Headset Microphone (Plantronics C520-M)"
The strings "Integrated Camera", "Microphone (Realtek High Definition Audio)" and "Headset Microphone (Plantronics C520-M)" now refer to the different usable sources. In ffmpeg, the -i parameter usually refers to the input file. In our case, we can combine a video and an audio source into an input specification for ffmpeg:
-i video="Integrated Camera":audio="Headset Microphone (Plantronics C520-M)"
-i video="Integrated Camera":audio="Microphone (Realtek High Definition Audio)"
ffmpeg -f dshow -i video="Integrated Camera":audio="Microphone (Realtek High Definition Audio)" -list_formats all
ffmpeg -f dshow -i video="Integrated Camera":audio="Headset Microphone (Plantronics C520-M)" -t 5 5-seconds.mp4
ffmpeg -f dshow -framerate 25 -video_size 1280x720 -i video="Integrated Camera":audio="Headset Microphone (Plantronics C520-M)" -y -nostdin -hide_banner -loglevel 0 -c:v libx264 -f mpegts -
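The input specification is just the two device names joined into one string; a minimal shell sketch of its shape (device names taken from the -list_devices output above, which must be matched exactly):

```shell
# Compose the dshow input specification "video=<dev>:audio=<dev>"
# from the device names reported by ffmpeg -list_devices.
VIDEO_DEV="Integrated Camera"
AUDIO_DEV="Headset Microphone (Plantronics C520-M)"
INPUT="video=${VIDEO_DEV}:audio=${AUDIO_DEV}"
echo "$INPUT"
```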
Screen capture filter
For capturing the local screen, you need a driver to tap into the video card. ffmpeg on Windows ships with the GDI grabber (-f gdigrab). It is also possible to use a DirectShow filter (-f dshow), but then you need a driver such as screen-capture-recorder. After installing that driver, you can use the ffmpeg input
-i video="screen-capture-recorder":audio="virtual-audio-capturer"
Write a 10 second screen capture (at 20 fps) to a local MP4 file
ffmpeg -f dshow -i video="screen-capture-recorder":audio="virtual-audio-capturer" -r 20 -t 10 screen-capture.mp4
ffmpeg -f dshow -i video="screen-capture-recorder":audio="Headset Microphone (Plantronics C520-M)" -r 20 -t 10 screen-capture.mp4
Play back current screen
ffplay -f dshow -i video="screen-capture-recorder" -vf scale=1280:720
ffmpeg -list_devices true -f dshow -i dummy
ffmpeg -f dshow -i video="Integrated Camera":audio="Microphone Array (Realtek High Definition Audio)" out.mp4
ffmpeg -i "m0-01 - A.mp4" -c copy -bsf:v h264_mp4toannexb -f mpegts "m0-01 - A.ts"
ffmpeg -i "m1-01 - B.mp4" -c copy -bsf:v h264_mp4toannexb -f mpegts "m1-01 - B.ts"
# emit one "file '...'" line per .ts file (single quotes escaped) into ts.txt
("file '" + (((dir "*.ts" | select -ExpandProperty Name) -replace "'", "\'") -join "'`nfile '") + "'") | Out-File -Encoding ascii -FilePath ts.txt
file 'm0-01 - A.ts'
file 'm1-01 - B.ts'
ffmpeg -f concat -i ts.txt -c copy -bsf:a aac_adtstoasc output.mp4
dir *.mp4 | foreach { ffmpeg -i $_.Name -c copy -bsf:v h264_mp4toannexb -f mpegts $_.Name.Replace("MP4", "ts").Replace("mp4", "ts") }
("file '" + (((dir "*.ts" | select -ExpandProperty Name) -replace "'", "\'") -join "'`nfile '") + "'") | Out-File -Encoding ascii -FilePath ts.txt
ffmpeg -f concat -safe 0 -i ts.txt -c copy -bsf:a aac_adtstoasc output.mp4
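The same ts.txt can be produced from a POSIX shell; a sketch mirroring the PowerShell one-liner's quoting (make_concat_list is an illustrative helper that escapes a single quote as \' the way the -replace above does):

```shell
# Emit one "file '...'" line per argument for the ffmpeg concat demuxer,
# escaping single quotes in file names as \'.
make_concat_list() {
  for f in "$@"; do
    printf "file '%s'\n" "$(printf '%s' "$f" | sed "s/'/\\\\'/g")"
  done
}

make_concat_list "m0-01 - A.ts" "m1-01 - B.ts" > ts.txt
```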
After creating an Azure Media Services Live channel, we get two RTMP ingest endpoints, which differ in their TCP port number (1935 and 1936):
rtmp://channel1-mediaservice321.channel.mediaservices.windows.net:1935/live/deadbeef012345678890abcdefabcdef
rtmp://channel1-mediaservice321.channel.mediaservices.windows.net:1936/live/deadbeef012345678890abcdefabcdef
For ffmpeg to work, we need to append the channel name /channel1 to the URLs:
rtmp://channel1-mediaservice321.channel.mediaservices.windows.net:1935/live/deadbeef012345678890abcdefabcdef/channel1
rtmp://channel1-mediaservice321.channel.mediaservices.windows.net:1936/live/deadbeef012345678890abcdefabcdef/channel1
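The resulting target URL is plain string concatenation of the portal's ingest URL and the channel name; a quick shell sketch:

```shell
# Build the full RTMP push target: ingest URL from the portal + "/" + channel name.
INGEST="rtmp://channel1-mediaservice321.channel.mediaservices.windows.net:1936/live/deadbeef012345678890abcdefabcdef"
CHANNEL="channel1"
DEST="${INGEST}/${CHANNEL}"
echo "$DEST"
```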
The Azure Blog now tells us to use RTMP with H.264 video and AAC audio, a 2-second key-frame interval, and CBR (constant bit rate) encoding.
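The 2-second key-frame interval translates into the GOP flags used in the recipes below: at r frames per second, -g and -keyint_min become 2*r, and -sc_threshold 0 suppresses extra key frames on scene changes; CBR falls out of setting -b:v, -minrate and -maxrate to the same value. A sketch of the arithmetic (30 fps assumed):

```shell
# A 2-second key-frame interval at FPS frames/second means a GOP of 2*FPS frames;
# pin both -g and -keyint_min to it and disable scene-cut key frames.
FPS=30
GOP=$((FPS * 2))
echo "-r $FPS -g $GOP -keyint_min $GOP -sc_threshold 0"
```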
misc
-y : overwrite output files without asking
-loglevel debug : (or verbose, quiet, panic, fatal)
-f dshow : use DirectShow filter
-i video="Integrated Camera":audio="Microphone (Realtek High Definition Audio)" : use internal web cam and microphone
Video output
-s 640x480 : resolution
-codec:v libx264 : H.264 / AVC video
-pix_fmt yuv420p : pixel format YUV420
-preset veryfast : (ultrafast, superfast, veryfast, faster, fast, medium, slow, slower, veryslow, placebo)
-b:v 200k : target video bit rate
-minrate 200k : minimum video bit rate
-maxrate 200k : maximum video bit rate
-r 30 : frame rate
-keyint_min 60 : minimum GOP size
-g 60 : maximum GOP size
-sc_threshold 0 : scene change threshold
-bsf:v h264_mp4toannexb : bitstream filter; use ffmpeg -bsfs for a full list
Audio output
-codec:a libvo_aacenc : AAC audio
-b:a 128k : audio bit rate
-ar 44100 : audio sampling frequency
-ac 2 : audio channels
-strict experimental : allow experimental codecs
overall stream
-bufsize 200k : buffer size
-maxrate 200k : maximum bit rate
Destination
-f flv rtmp://chan1-acc2.channel.mediaservices.windows.net:1936/live/deadbeef/chan1 : target RTMP endpoint to push to
set DEST=rtmp://channel1-mediaservice321.channel.mediaservices.windows.net:1935/live/deadbeef012345678890abcdefabcdef/channel1
set SRC=video="Integrated Camera":audio="Headset Microphone (GN 2000 USB OC)"
ffmpeg -f dshow -i %SRC% -s 640x480 -preset veryfast -codec:v libx264 -pix_fmt yuv420p -b:v 200k -minrate 200k -maxrate 200k -bufsize 200k -r 30 -g 60 -keyint_min 60 -sc_threshold 0 -codec:a aac -b:a 48k -f flv %DEST%
set VIDEOBITRATE=200k
ffmpeg -f dshow -i %SRC% -s 640x480 -preset veryslow -codec:v libx264 -pix_fmt yuv420p -pass 1 -b:v %VIDEOBITRATE% -minrate %VIDEOBITRATE% -maxrate %VIDEOBITRATE% -bufsize %VIDEOBITRATE% -r 30 -g 60 -keyint_min 60 -sc_threshold 0 -profile:v main -level 3.1 -codec:a aac -ar 44100 -b:a 96k -ac 2 -f flv %DEST%
set DEST=rtmp://channel1-mediaservice321.channel.mediaservices.windows.net:1935/live/deadbeef012345678890abcdefabcdef/channel1
set SRC="C:\Users\chgeuer\Cosmos Laundromat - First Cycle. Official Blender Foundation release.-Y-rmzh0PI3c.webm"
ffmpeg -re -i %SRC% -s 640x480 -preset veryslow -codec:v libx264 -pix_fmt yuv420p -pass 1 -b:v %VIDEOBITRATE% -minrate %VIDEOBITRATE% -maxrate %VIDEOBITRATE% -bufsize %VIDEOBITRATE% -r 30 -g 60 -keyint_min 60 -sc_threshold 0 -profile:v main -level 3.1 -codec:a aac -ar 44100 -b:a 96k -ac 2 -f flv %DEST%
You can use the DASHPlayer or aka.ms/azuremediaplayer. Don't forget to append (format=mpd-time-csf) or (format=m3u8-aapl) to the streaming URLs for DASH or HLS streaming.
RTP protocol (MPEG Transport Streams) encoded MPEG-2
-f mpegts udp://127.0.0.1:10000?pkt_size=1316
-f rtp rtp://127.0.0.1:1234
- [FFMPEG for TS streaming](https://www.wowza.com/forums/content.php?213-How-to-use-FFmpeg-with-Wowza-Media-Server-(MPEG-TS))
ffmpeg -re -i %SRC% -s 640x480 -preset veryslow -codec:v libx264 -pix_fmt yuv420p -pass 1 -b:v %VIDEOBITRATE% -minrate %VIDEOBITRATE% -maxrate %VIDEOBITRATE% -bufsize %VIDEOBITRATE% -r 30 -g 60 -keyint_min 60 -sc_threshold 0 -profile:v main -level 3.1 -codec:a aac -ar 44100 -b:a 96k -ac 2 -f rtp rtp://127.0.0.1:1234
ffmpeg -v verbose
-i MysampleVideo.mp4 -strict -2
-codec:a aac -b:a 128k -ar 44100
-codec:v libx264 -b:v 400000 -bufsize 400k -maxrate 400k -preset medium
-r 30 -g 60 -keyint_min 60
-f flv rtmp://channel001-streamingtest.channel.media.windows.net:1935/live/a9bcd589da4b424099364f7ad5bd4940/mystream1
ffmpeg -threads 15 -re -i MysampleVideo.mp4
-strict experimental
-codec:a aac -ab 128k -ac 2 -ar 44100
-codec:v libx264 -s 800x600 -b:v 500k -minrate 500k -maxrate 500k -bufsize 500k
-r 30 -g 60 -keyint_min 60 -sc_threshold 0
-f flv rtmp://channel001-streamingtest.channel.media.windows.net:1935/live/a9bcd589da4b424099364f7ad5bd4940/Streams_500
-strict experimental
-codec:a aac -ab 128k -ac 2 -ar 44100
-codec:v libx264 -s 640x480 -b:v 300k -minrate 300k -maxrate 300k -bufsize 300k
-r 30 -g 60 -keyint_min 60 -sc_threshold 0
-f flv rtmp://channel001-streamingtest.channel.media.windows.net:1935/live/a9bcd589da4b424099364f7ad5bd4940/Streams_300
-strict experimental
-codec:a aac -ab 128k -ac 2 -ar 44100
-codec:v libx264 -s 320x240 -b:v 150k -minrate 150k -maxrate 150k -bufsize 150k
-r 30 -g 60 -keyint_min 60 -sc_threshold 0
-f flv rtmp://channel001-streamingtest.channel.media.windows.net:1935/live/a9bcd589da4b424099364f7ad5bd4940/Streams_150
ffmpeg -i infile.flac outfile.wav
REM http://etree.org/shnutils/shntool/
shntool.exe split -f infile.cue -t %n-%t -m /- outfile.wav
dir *.wav | foreach { ffmpeg -i $_.Name -ab 320k $_.Name.Replace("wav", "mp3") }
REM Convert FLAC to MP3 VBR
dir *.flac | foreach { ffmpeg -i $_.Name -qscale:a 1 $_.Name.Replace("flac", "mp3") }
REM Convert FLAC to MP3 320k
dir *.flac | foreach { ffmpeg -i $_.Name -ab 320k $_.Name.Replace("flac", "mp3") }
REM Create M4B from MP3 collection
ffmpeg -i "concat:01.mp3|02.mp3" -c:a libvo_aacenc -vn out.m4a
ren out.m4a out.m4b
REM Convert mp3 to m4a
dir *.mp3 | foreach { ffmpeg -i $_.Name -c:a libvo_aacenc -vn $_.Name.Replace("mp3", "m4a") }
# Convert a bunch of MP3 files to an iPod audio book
$folder = "C:\Users\Public\Music\Star Wars Episode 1 - Die dunkle Bedrohung"
Function concatenate($lines) {
$sb = New-Object -TypeName "System.Text.StringBuilder";
[void]$sb.Append("""");
[void]$sb.Append("concat:");
for ($i=0; $i -lt $lines.Length; $i++) {
[void]$sb.Append($lines[$i].Name);
if ($i -le ($lines.Length - 2)) {
[void]$sb.Append("|");
}
}
[void]$sb.Append("""");
return $sb.ToString();
}
Set-Location $folder
$filename = (Get-Item $folder).Name
$inputfiles = Get-ChildItem -Filter *.mp3 | Sort-Object -Property Name
$concatenation = concatenate($inputfiles)
# ffmpeg -i "concat:01.mp3|02.mp3" -c:a libvo_aacenc -vn 1.m4a
ffmpeg -i $concatenation -c:a libvo_aacenc -vn "$filename.m4a"
# Rename-Item -Path "$filename.m4a" -NewName "$filename.m4b"
# compare two videos @see http://ianfeather.co.uk/compare-two-webpagetest-videos-using-ffmpeg/
ffmpeg -i before.mp4 -i after.mp4 -filter_complex "[0:v:0]pad=iw*2:ih[bg]; [bg][1:v:0]overlay=w" output.mp4
SET FFMPEG="c:\program files\ffmpeg\bin\ffmpeg.exe"
REM the later SET wins; keep the variant you want, blank the rest
SET GOPSIZE=-g 25
SET GOPSIZE=
SET VIDEOBITRATE=-b:v 1500k
SET RESOLUTION=-s "960x540"
SET RESOLUTION=
REM http://www.idude.net/index.php/how-to-watermark-a-video-using-ffmpeg
REM watermark variants: bottom-right overlay, centered overlay, or PNG logo; the last SET is the active one
SET WATERMARK= -filter_complex "overlay=main_w-overlay_w-10:main_h-overlay_h-10"
SET WATERMARK= -filter_complex "overlay=(main_w+overlay_w)/2:(main_h+overlay_h)/2"
SET WATERMARK= -vf "movie=logo2.png [watermark]; [in][watermark] overlay=main_w-overlay_w-10:main_h-overlay_h-10 [out]"
SET CODEC_MP4= -vcodec libx264 -pix_fmt yuv420p %WATERMARK% %GOPSIZE% %VIDEOBITRATE%
SET CODEC_WEBM= -vcodec libvpx -acodec libvorbis -ab 160000 -f webm %WATERMARK% %GOPSIZE% %VIDEOBITRATE%
SET CODEC_OGV= -vcodec libtheora -acodec libvorbis -ab 160000 %WATERMARK% %GOPSIZE% %VIDEOBITRATE%
SET CODEC_POSTER= -ss 00:02 -vframes 1 -r 1 -f image2 %WATERMARK%
%FFMPEG% -i %1 %CODEC_MP4% %RESOLUTION% "%~n1.mp4"
%FFMPEG% -i %1 %CODEC_WEBM% %RESOLUTION% "%~n1.webm"
%FFMPEG% -i %1 %CODEC_OGV% %RESOLUTION% "%~n1.ogv"
%FFMPEG% -i %1 %CODEC_POSTER% %RESOLUTION% "%~n1.jpg"
REM http://stackoverflow.com/questions/7333232/concatenate-two-mp4-files-using-ffmpeg
REM file '1.mp4'
REM file '2.mp4'
REM %FFMPEG% -f concat -i mylist.txt -c copy output.mp4
REM Remux MOV to MP4 (video stream copied, audio re-encoded to AAC)
ffmpeg -i input.mov -vcodec copy -acodec libvo_aacenc -map_metadata 0 result.mp4