I am trying to live stream a sequence of images, generated in real time, to a web page.
I am piping raw ARGB bitmaps into ffmpeg via the pipe: input, as follows:
ffmpeg -f rawvideo -pix_fmt argb -s 1280x720 -use_wallclock_as_timestamps 1 -i pipe: ...
(Note #1: If I specify a video file as output, e.g. out.mp4, it generates a valid file)
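For context, the producer side looks roughly like this minimal sketch (Node.js assumed here; renderFrame() is just a stand-in for my real generator and emits blank 1280x720 ARGB frames, with no backpressure handling):

const { spawn } = require("child_process");

const ffmpeg = spawn("ffmpeg", [
  "-f", "rawvideo", "-pix_fmt", "argb", "-s", "1280x720",
  "-use_wallclock_as_timestamps", "1",
  "-i", "pipe:",
  "-c:v", "libx264", "-tune", "zerolatency",
  "-f", "dash", "my.mpd",
]);

// Stand-in for the real image generator: one 1280x720 ARGB frame (4 bytes per pixel).
function renderFrame() {
  return Buffer.alloc(1280 * 720 * 4);
}

// Push ~30 frames per second into ffmpeg's stdin.
setInterval(() => {
  ffmpeg.stdin.write(renderFrame());
}, 1000 / 30);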
The same ffmpeg invocation then muxes the video to DASH:
... -i pipe: -c:v libx264 -tune zerolatency -f dash my.mpd
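(For clarity, those two fragments are a single command; spelled out in full it is:)

ffmpeg -f rawvideo -pix_fmt argb -s 1280x720 -use_wallclock_as_timestamps 1 -i pipe: -c:v libx264 -tune zerolatency -f dash my.mpd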
Then I use the following html page to display it:
<!doctype html>
<html>
  <head>
    <title>Dash.js Rocks</title>
    <style>
      video {
        width: 1280px;
        height: 720px;
      }
    </style>
  </head>
  <body>
    <div>
      <video id="videoPlayer" controls autoplay></video>
    </div>
    <script src="https://cdn.dashjs.org/v3.1.0/dash.all.min.js"></script>
    <script>
      (function () {
        var url = "/my.mpd";
        var player = dashjs.MediaPlayer().create();
        player.initialize(document.querySelector("#videoPlayer"), url, true);
        player.play();
      })();
    </script>
  </body>
</html>
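(In case the player-side buffer matters: as far as I can tell, dash.js v3 also exposes latency settings via updateSettings, which would go right after player.initialize. The values below are only illustrative:)

player.updateSettings({
  streaming: {
    lowLatencyEnabled: true, // enable dash.js low-latency mode
    liveDelay: 2             // target distance from the live edge, in seconds
  }
});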
My problem is a latency of 20-40 seconds between the moment an image is generated and the moment it appears in the browser, and nothing I have tried gets rid of it.
(Note #2: Playing the stream with ffplay my.mpd shows the same delay, so the issue does not seem specific to dash.js)
I've tried:
-ldash 1 -streaming 1 -target_latency 0 -index_correction 1
and many, many more options over the past two weeks.
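A typical attempt, with those flags slotted into the same command, looked roughly like:

ffmpeg -f rawvideo -pix_fmt argb -s 1280x720 -use_wallclock_as_timestamps 1 -i pipe: -c:v libx264 -tune zerolatency -f dash -ldash 1 -streaming 1 -target_latency 0 -index_correction 1 my.mpd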