
It took me a while of hunting, but I managed to get low-latency, real-time video streaming working on my RPi2.

I'm running this on my RPi2:

raspivid -t 999999 -h 720 -w 1280 -fps 25 -b 2000000 -vf -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=0.0.0.0 port=8554

and I can consume this stream on an Ubuntu 14 laptop with GStreamer like:

gst-launch-0.10 -v tcpclientsrc host=myrpi.local port=8554 ! gdpdepay ! rtph264depay ! ffdec_h264 ! autovideosink

However, I want to view this stream in a web browser (Firefox or Chrome) using the HTML5 video tag. Apparently, both Firefox's and Chrome's implementations of this tag support the Ogg, WebM, and MP4 (H.264) video formats.

If I'm understanding the raspivid/GStreamer commands correctly, they're streaming and consuming H.264 video? However, when I try to consume the stream via the <video> tag with:

<!DOCTYPE html> 
<html> 
<body> 
    <video controls autoplay>
      <source src="http://rae.local:8554">
      Your browser does not support HTML5 video.
    </video>
</body> 
</html>

Firefox and Chrome say it's in an invalid format.

Is there any way to get a low-latency video stream from the RPi that works with the <video> tag? I've seen some examples using WebRTC or FFmpeg, but these all have 20-30s of latency and don't work with raspivid.

Edit: I followed these instructions to install UV4L via a custom repo (roughly the steps sketched below) and managed to get the uv4l server running with:

uv4l -nopreview --auto-video_nr --driver raspicam --encoding mjpeg --width 640 --height 480 --framerate 20 --hflip=yes --vflip=yes --bitrate=2000000 --server-option '--port=9090' --server-option '--max-queued-connections=30' --server-option '--max-streams=25' --server-option '--max-threads=29'
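
For reference, the repo setup from those instructions looked roughly like this; the repository URL, distro codename, and package names are recalled from memory, so treat them as assumptions and check the UV4L documentation:

# add the UV4L signing key and apt repository (codename may differ on newer Raspbian releases)
curl http://www.linux-projects.org/listing/uv4l_repo/lpkey.asc | sudo apt-key add -
echo "deb http://www.linux-projects.org/listing/uv4l_repo/raspbian/ jessie main" | sudo tee -a /etc/apt/sources.list
sudo apt-get update
sudo apt-get install uv4l uv4l-raspicam uv4l-raspicam-extras uv4l-server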

I was then able to access the web UI at http://mypi.local:9090/. It also serves an MJPEG stream that works even from a page hosted outside the server:

<!DOCTYPE html>
<html>
    <head>
        <title>UV4L Stream</title>
        <meta charset="UTF-8">
        <meta name="viewport" content="width=device-width, initial-scale=1.0">
    </head>
    <body>
        <script>
            function errorFunction() {
                alert('Stream stopped');
            }
        </script>
        <img src="http://mypi.local:9090/stream/video.mjpeg" alt="image" onerror="errorFunction()">
    </body>
</html>

However, this clearly doesn't use the HTML5 <video> tag. I then tried the WebRTC example page, and that does appear to run in the browser, but it's heavily JavaScript-based, so I can't tell exactly how it's implemented or whether I can reuse it in my own application.
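
Presumably, underneath all that JavaScript such a page is driving the browser's standard RTCPeerConnection API. If so, a stripped-down, receive-only sketch of the idea would look something like the page below; getOfferFromServer() and sendAnswerToServer() are placeholders for whatever signalling channel the server actually uses, not UV4L's real protocol:

<!DOCTYPE html>
<html>
    <body>
        <video id="remote" autoplay controls></video>
        <script>
            // getOfferFromServer() and sendAnswerToServer() are placeholders:
            // the real signalling (HTTP, WebSocket, ...) is server-specific.
            async function start() {
                const pc = new RTCPeerConnection();
                // attach the incoming remote video track to the <video> element
                pc.ontrack = function (e) {
                    document.getElementById('remote').srcObject = e.streams[0];
                };
                const offer = await getOfferFromServer();
                await pc.setRemoteDescription(offer);
                const answer = await pc.createAnswer();
                await pc.setLocalDescription(answer);
                await sendAnswerToServer(answer);
            }
            start();
        </script>
    </body>
</html>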

Cerin

3 Answers


You can send raw H.264 frames to a browser through a WebSocket and decode them in JavaScript. Latency < 0.1s :p I wrote an open-source project in this manner; check out https://github.com/131/h264-live-player
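
Roughly, the browser side of that approach looks like the sketch below; H264Decoder is a stand-in for the Broadway-style JS decoder the project actually ships, and the WebSocket port is just an example, so check the repo for the real names:

<canvas id="screen" width="1280" height="720"></canvas>
<script>
    // H264Decoder is a placeholder for the project's bundled JS decoder,
    // and the WebSocket URL/port is only an example.
    var decoder = new H264Decoder(document.getElementById('screen'));
    var ws = new WebSocket('ws://myrpi.local:8080/');
    ws.binaryType = 'arraybuffer';
    ws.onmessage = function (msg) {
        // each message carries raw H.264 NAL units sent by the server
        decoder.decode(new Uint8Array(msg.data));
    };
</script>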

131

The best and easiest way to stream to a browser is UV4L with WebRTC (much better than MJPEG). It works out of the box with the embedded streaming server; no extra commands are required. It optionally supports audio as well, and audio and/or video can be bidirectional. The streaming server can also load customized pages. With Chrome (much better than Firefox here), latency can be as low as 150ms with Full HD, hardware-encoded H.264.

strumpet

Try my fmp4streamer project! It works without JavaScript and has no dependencies other than Python and V4L2. It adds a fragmented MP4 header to the H.264 stream on the server side.
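
Since the browser then sees an ordinary fragmented MP4, the page reduces to a plain <video> tag. The port and stream path below are guesses, so check the project's README for the actual defaults:

<!DOCTYPE html>
<html>
    <body>
        <!-- URL is an assumption; use whatever fmp4streamer actually serves -->
        <video controls autoplay muted>
            <source src="http://myrpi.local:8000/stream.mp4" type="video/mp4">
        </video>
    </body>
</html>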

soyer