
I got the Pi B+ and the Pi camera and am now trying to find the most efficient (low CPU) and lowest-latency configuration to stream H.264 encoded video from the camera to my home server.

I've read the following:

  1. http://pi.gbaman.info/?p=150

  2. http://blog.tkjelectronics.dk/2013/06/how-to-stream-video-and-audio-from-a-raspberry-pi-with-no-latency/comment-page-1/#comments

  3. http://www.raspberrypi.org/forums/viewtopic.php?p=464522

(All links use gstreamer-1.0 from deb http://vontaene.de/raspbian-updates/ . main.)

A lot has happened in this area over the past few years.

Originally, we had to pipe the output of raspivid into gst-launch-1.0 (see link 1).

Then (link 2) the official V4L2 driver was created, which is now standard, and it allows obtaining the data directly without a pipe, using just gstreamer (see especially the post by towolf » Sat Dec 07, 2013 3:34 pm in link 2):

Sender (Pi): gst-launch-1.0 -e v4l2src do-timestamp=true ! video/x-h264,width=640,height=480,framerate=30/1 ! h264parse ! rtph264pay config-interval=1 ! gdppay ! udpsink host=192.168.178.20 port=5000

Receiver: gst-launch-1.0 -v udpsrc port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! fpsdisplaysink sync=false text-overlay=false

If I understand correctly, both ways use the GPU to do the H264 encoding, but the latter is a bit more efficient since it doesn't need to go through the kernel another time, as there's no pipe between processes involved.


Now I have some questions about this.

  1. Is the latter still the most recent way to efficiently get H264 from the camera? I've read about gst-omx, which allows gstreamer pipelines like ... video/x-raw ! omxh264enc ! .... Does this do anything different from just using video/x-h264, or might it even be more efficient? What's the difference?

  2. How do I find out which gstreamer encoding plugin is actually used when I use the video/x-h264 ... pipeline? This seems to just specify the format I want, as opposed to the other pipeline parts, where I explicitly name the software component (like h264parse or fpsdisplaysink).

  3. In this reply to link 1 Mikael Lepistö mentions "I removed one unnecessary filter pass from streaming side", meaning that he cut out the gdppay and gdpdepay. What do those do? Why are they needed? Can I really strip them off?

  4. He also mentions that by specifying caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" parameters for the udpsrc at the receiving side, he's able to start/resume the streaming in the middle of the stream. What do these caps achieve, why these specific choices, where can I read more about them?

  5. When I do what's suggested in questions 3 and 4 (adding the caps, dropping gdppay and gdpdepay), my video latency becomes much worse (and seems to accumulate: the latency increases over time, and after a few minutes the video stops)! Why could that be? I would like to get the latency I obtained with the original command, but also have the ability to join the stream at any time.

  6. I've read that RTSP+RTP usually use a combination of TCP and UDP: TCP for control messages and other things that mustn't get lost, and UDP for the actual video data transmission. In the setups above, am I actually using that, or am I just using UDP only? It's a bit opaque to me whether gstreamer takes care of this or not.

I would appreciate any answer to even a single one of these questions!

nh2

4 Answers


The options:

  1. raspivid -t 0 -o - | nc -k -l 1234

  2. raspivid -t 0 -o - | cvlc stream:///dev/stdin --sout "#rtp{sdp=rtsp://:1234/}" :demux=h264

  3. cvlc v4l2:///dev/video0 --v4l2-chroma h264 --sout '#rtp{sdp=rtsp://:1234/}'

  4. raspivid -t 0 -o - | gst-launch-1.0 fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=SERVER_IP port=1234

  5. gst-launch-1.0 -e v4l2src do-timestamp=true ! video/x-h264,width=640,height=480,framerate=30/1 ! h264parse ! rtph264pay config-interval=1 ! gdppay ! udpsink host=SERVER_IP port=1234

  6. uv4l --driver raspicam

  7. picam --alsadev hw:1,0
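
For reference, a matching client for option 1 might look like this (a sketch, not part of the original comparison; it assumes mplayer is installed on the viewing machine and SERVER_IP is the Pi's address):

```shell
# Option 1 client: the Pi listens with `raspivid -t 0 -o - | nc -k -l 1234`,
# the client connects over TCP and feeds the raw H264 elementary stream
# to mplayer's h264es demuxer.
mplayer -fps 30 -cache 1024 -demuxer h264es ffmpeg://tcp://SERVER_IP:1234
```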

Things to consider

  • latency [ms] (with and without the client requesting more fps than the server provides)
  • CPU idle [%] (measured by top -d 10)
  • CPU 1 client [%]
  • RAM [MB] (RES)
  • same encoding settings
  • same features
    • audio
    • reconnect
    • OS independent client (vlc, webrtc, etc)

Comparison:

            1    2    3    4    5    6    7
latency     2000 5000 ?    ?    ?    ?    1300
CPU         ?    1.4  ?    ?    ?    ?    ?
CPU 1       ?    1.8  ?    ?    ?    ?    ?
RAM         ?    14   ?    ?    ?    ?    ?
encoding    ?    ?    ?    ?    ?    ?    ?
audio       n    ?    ?    ?    ?    y    ?
reconnect   y    y    ?    ?    ?    y    ?
any OS      n    y    ?    ?    ?    y    ?
latency fps ?    ?    ?    ?    ?    ?    ?
Martin
user1133275

I'm amazed there isn't more action on this thread; I've been chasing down the answer to this question for months.

I stream from a Pi Camera (CSI) to a Janus server, and I found the best pipeline is

gst-launch-1.0 v4l2src ! video/x-h264, width=$width, height=$height, framerate=$framerate/1 ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink sync=false host=$host port=$port

v4l2src uses the memory-efficient bcm2835-v4l2 module and pulls hardware-compressed H264 video directly. On a Pi Zero, gst-launch consumes between 4% and 10% CPU streaming 1280x720 at 30fps. I am also able to resume the stream at any time, without using gdppay. Make sure you run rpi-update to get the MMAL V4L2 driver. My Pi is also underclocked and over_voltaged for stability, and streams uninterrupted for days, see here
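
For a plain GStreamer client (rather than Janus), the receiving end would look roughly like this — a sketch rather than a tested recipe, since without gdppay the udpsrc needs explicit RTP caps, and the values shown are assumptions that must match your encoder settings:

```shell
# Hedged sketch of a matching receiver: explicit RTP caps on udpsrc
# replace what gdpdepay would otherwise negotiate, and sync=false on
# the sink avoids clock-sync latency build-up, mirroring the sender.
gst-launch-1.0 udpsrc port=$port \
    caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" \
  ! rtph264depay ! avdec_h264 ! autovideosink sync=false
```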

[screenshot of top output]

I stumbled over a lot of the same problems that the OP had. The most frustrating was problem 5: latency accumulating over time, eventually crashing the Pi. The solution is the sync=false property on the udpsink element. The gstreamer docs don't say much about it beyond that it disables clock synchronisation, but after a lot of tears I discovered that with it I can stream for hours without accumulating latency.

I also fought problem 4: I couldn't resume a stream or start watching after the stream began. The solution is config-interval on rtph264pay, which rebroadcasts the SPS and PPS. config-interval=1 re-sends them every second (the value is in seconds; -1 sends them with every keyframe), which allows me to pick up the stream at any time.

I got pretty close to the same stream using the ffmpeg pipeline:

ffmpeg -f h264 -framerate $framerate -i /dev/video0 -vcodec copy -g 60 -r $framerate -f rtp rtp://$hostname:$port

but I can't resume the stream: if I refresh a page while streaming, I get no stream. I assume this is because of the missing SPS and PPS frames. If anyone knows how to pack them with ffmpeg, I'd love to know.
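
One candidate I haven't verified is ffmpeg's dump_extra bitstream filter, which duplicates the stream's global headers (where the SPS/PPS live) into packets at keyframes:

```shell
# Untested sketch: same pipeline as above, with dump_extra re-inserting
# the global headers (SPS/PPS) at each keyframe, so a client joining
# mid-stream should find the decoding parameters it needs.
ffmpeg -f h264 -framerate $framerate -i /dev/video0 \
    -vcodec copy -bsf:v dump_extra -g 60 -r $framerate \
    -f rtp rtp://$hostname:$port
```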

Btw, I also use v4l2-ctl to set parameters. ffmpeg seems to recognize settings like width and height automatically, but for gstreamer the caps have to match what the hardware is producing:

v4l2-ctl --set-fmt-video=width=$width,height=$height,pixelformat=4
v4l2-ctl --set-ctrl=rotate=$rotation
v4l2-ctl --overlay=1
v4l2-ctl -p $framerate
v4l2-ctl --set-ctrl=video_bitrate=4000000  # or whatever bitrate you need
Ben Olayinka

The only modern way to stream H264 to a browser is with UV4L: no latency, no configuration, with optional audio, optional two-way audio/video. No magic GStreamer sauce, yet it's possible to extend its usage.

techraf
prinxis

1.) h264es streaming across the network (sample only)

on server:

raspivid -v -a 524 -a 4 -a "rpi-0 %Y-%m-%d %X" -fps 15 -n -md 2 -ih -t 0 -l -o tcp://0.0.0.0:5001

on client:

mplayer -nostop-xscreensaver -nolirc -fps 15 -vo xv -vf rotate=2,screenshot -xy 1200 -demuxer h264es ffmpeg://tcp://<rpi-ip-address>:5001

2.) mjpeg streaming across the network (sample only)

on server:

/usr/local/bin/mjpg_streamer -o output_http.so -w ./www -i input_raspicam.so -x 1920 -y 1440 -fps 3

on client:

mplayer -nostop-xscreensaver -nolirc -fps 15 -vo xv -vf rotate=2,screenshot -xy 1200 -demuxer lavf http://<rpi-ip-address>:8080/?action=stream

All of this even works on an RPi Zero W (configured as the server).

sparkie