I have a system that encodes video from a camera, which can then be sent
to YouTube. I've written calls to the API to set up the video and test the
stream, then switch to live, and then stop. However, my encoder doesn't
support streaming over RTMP, so I use FFmpeg to remux the H.264 stream
into an RTMP stream. This works well.
However, by the time the stream reaches YouTube it is a couple of seconds
delayed (and the delay varies) from when it left the camera, which I
believe is down to FFmpeg buffering. This means that the moment the user,
watching the camera picture, starts or stops the stream isn't the moment
the stream actually starts or stops on YouTube. Is there any way of
a) reducing the buffering within FFmpeg, or b) controlling the buffering
so that the delay from the incoming live stream is predictable?
The one constraint is that I cannot re-encode the stream, as that would
take too much CPU time.
Current command line is:
ffmpeg -i udp://@xyz -codec copy -bsf:a aac_adtstoasc -f flv
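
For concreteness, this is roughly where I'd expect latency-related input
flags to slot into that command. I haven't verified that these values are
right for my stream (very small probe sizes can break stream detection when
copying rather than re-encoding), and the RTMP target here is a placeholder:

```shell
# -fflags nobuffer / -flags low_delay: reduce input-side frame buffering
# -probesize 32 -analyzeduration 0: shrink the start-up probing window
#   (may need raising if FFmpeg fails to detect the streams)
# -flush_packets 1: hand each packet to the output as soon as it is muxed
ffmpeg -fflags nobuffer -flags low_delay \
  -probesize 32 -analyzeduration 0 \
  -i udp://@xyz \
  -codec copy -bsf:a aac_adtstoasc \
  -flush_packets 1 \
  -f flv "$RTMP_URL"   # placeholder for the YouTube ingest URL
```

Would flags along these lines be the right approach, or is the buffering
happening somewhere they don't control?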