Commit d95283f
committed
Updates to rpicam-vid and particularly to the streaming examples
1 parent 3c121e6 commit d95283f

4 files changed: +117 -42 lines changed
documentation/asciidoc/computers/camera/rpicam_configuration.adoc (+2)

@@ -38,6 +38,8 @@ Raspberry Pi OS recognises the following overlays in `/boot/firmware/config.txt`

 To use one of these overlays, you must disable automatic camera detection. To disable automatic detection, set `camera_auto_detect=0` in `/boot/firmware/config.txt`. If `config.txt` already contains a line assigning a `camera_auto_detect` value, change the value to `0`. Reboot your Raspberry Pi with `sudo reboot` to load your changes.

+If your Raspberry Pi has two camera connectors (Raspberry Pi 5 or CM4, for example), you can specify which one you mean by appending `,cam0` or `,cam1` (without any spaces) to the `dtoverlay` line from the table above. If you add neither, the overlay defaults to camera connector 1 (`cam1`). Note that for official Raspberry Pi camera modules, auto-detection correctly identifies all the cameras connected to your device.
+
 [[tuning-files]]
 ==== Tweak camera behaviour with tuning files
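As an illustration, a `/boot/firmware/config.txt` fragment selecting a camera on connector 0 might look like the following sketch (`imx477` here is just an example overlay name; substitute the entry for your sensor from the table above):

----
camera_auto_detect=0
dtoverlay=imx477,cam0
----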

documentation/asciidoc/computers/camera/rpicam_options_vid.adoc (+3)

@@ -132,3 +132,6 @@ Records exactly the specified number of frames. Any non-zero value overrides xre

 Records exactly the specified framerate. Accepts a nonzero integer.

+==== `low-latency`
+
+On a Pi 5, the `--low-latency` option reduces encoding latency, which may benefit real-time streaming applications, in return for slightly lower coding efficiency (for example, B frames and arithmetic coding are no longer used).
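As a minimal sketch, a low-latency recording on a Pi 5 simply adds the option to a normal capture command:

[source,console]
----
$ rpicam-vid -t 10s --low-latency -o test.h264
----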

documentation/asciidoc/computers/camera/rpicam_vid.adoc (+22 -2)
@@ -11,20 +11,32 @@ For example, the following command writes a ten-second video to a file named `te
 $ rpicam-vid -t 10s -o test.h264
 ----

-You can play the resulting file with VLC and other video players:
+You can play the resulting file with ffplay and other video players:

 [source,console]
 ----
-$ vlc test.h264
+$ ffplay test.h264
 ----

+[WARNING]
+====
+Older versions of VLC played H.264 files correctly, but recent versions do not, displaying only a few (possibly garbled) frames. Either use a different media player, or save your files in a more widely supported container format such as MP4 (see below).
+====
+
 On Raspberry Pi 5, you can output to the MP4 container format directly by specifying the `mp4` file extension for your output file:

 [source,console]
 ----
 $ rpicam-vid -t 10s -o test.mp4
 ----

+On Raspberry Pi 4 or earlier devices, you can save MP4 files using:
+
+[source,console]
+----
+$ rpicam-vid -t 10s --codec libav -o test.mp4
+----
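If you already have a raw `.h264` file, you can also wrap it into an MP4 container after the fact. This sketch assumes `ffmpeg` is installed and that the clip was recorded at 30 frames per second (raw H.264 carries no timestamps, so the framerate must be supplied):

[source,console]
----
$ ffmpeg -framerate 30 -i test.h264 -c copy test.mp4
----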

 ==== Encoders

 `rpicam-vid` supports motion JPEG as well as both uncompressed and unformatted YUV420:
@@ -76,3 +88,11 @@ To enable the `libav` backend, pass `libav` to the xref:camera_software.adoc#cod
 ----
 $ rpicam-vid --codec libav --libav-format avi --libav-audio --output example.avi
 ----
+
+==== Low latency video with the Pi 5
+
+Pi 5 uses software video encoders. These generally output frames with a longer latency than the old hardware encoders, which can sometimes be an issue for real-time streaming applications.
+
+In this case, add the `--low-latency` option to the `rpicam-vid` command. This alters certain encoder options to output each encoded frame more quickly.
+
+The downside is that coding efficiency is slightly reduced and the processor's multiple cores may be used slightly less efficiently. The maximum framerate that can be encoded may also drop slightly (though it will still easily achieve 1080p30).
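Combined with the network streaming commands described elsewhere in these docs, a low-latency MPEG-2 Transport Stream over UDP on a Pi 5 might be sketched as follows (`<ip-addr>` and `<port>` are the usual placeholders):

[source,console]
----
$ rpicam-vid -t 0 -n --low-latency --codec libav --libav-format mpegts -o udp://<ip-addr>:<port>
----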
@@ -1,74 +1,82 @@
 == Stream video over a network with `rpicam-apps`

-This section describes native streaming from `rpicam-vid`. You can also use the xref:camera_software.adoc#libav-integration-with-rpicam-vid[`libav`] backend for network streaming.
+This section describes how to stream video over a network using `rpicam-vid`. Whilst it's possible to stream very simple formats without using `libav`, for most applications we recommend the xref:camera_software.adoc#libav-integration-with-rpicam-vid[`libav` backend].

 === UDP

 To stream video over UDP using a Raspberry Pi as a server, use the following command, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:

 [source,console]
 ----
-$ rpicam-vid -t 0 --inline -o udp://<ip-addr>:<port>
+$ rpicam-vid -t 0 -n --inline -o udp://<ip-addr>:<port>
 ----

 To view video streamed over UDP using a Raspberry Pi as a client, use the following command, replacing the `<port>` placeholder with the port you would like to stream from:

 [source,console]
 ----
-$ vlc udp://@:<port> :demux=h264
+$ ffplay udp://@:<port> -fflags nobuffer -flags low_delay -framedrop
 ----
+As noted previously, `vlc` no longer handles unencapsulated H.264 streams.

-Alternatively, use the following command on a client to stream using `ffplay`:
+In fact, support for unencapsulated H.264 is generally quite poor, so it is often better to send an MPEG-2 Transport Stream instead. Using `libav`, this can be accomplished with:

 [source,console]
 ----
-$ ffplay udp://<ip-addr-of-server>:<port> -fflags nobuffer -flags low_delay -framedrop
+$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts -o udp://<ip-addr>:<port>
 ----

-=== TCP
-
-You can also stream video over TCP. To use a Raspberry Pi as a server:
+In this case, we can also play the stream successfully with `vlc`:

 [source,console]
 ----
-$ rpicam-vid -t 0 --inline --listen -o tcp://0.0.0.0:<port>
+$ vlc udp://@:<port>
 ----

-To view video streamed over TCP using a Raspberry Pi as a client, use the following command:
+=== TCP
+
+You can also stream video over TCP. As before, we can send an unencapsulated H.264 stream over the network. To use a Raspberry Pi as a server:

 [source,console]
 ----
-$ vlc tcp/h264://<ip-addr-of-server>:<port>
+$ rpicam-vid -t 0 -n --inline --listen -o tcp://0.0.0.0:<port>
 ----

-Alternatively, use the following command on a client to stream using `ffplay` at 30 frames per second:
+To view video streamed over TCP using a Raspberry Pi as a client, assuming the server is running at 30 frames per second, use the following command:

 [source,console]
 ----
 $ ffplay tcp://<ip-addr-of-server>:<port> -vf "setpts=N/30" -fflags nobuffer -flags low_delay -framedrop
 ----

-=== RTSP
+As with the UDP examples, it is often preferable to send an MPEG-2 Transport Stream, as this is generally better supported. To do this, use:

-To use VLC to stream video over RTSP using a Raspberry Pi as a server, use the following command:
+[source,console]
+----
+$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts -o tcp://0.0.0.0:<port>?listen=1
+----
+
+We can now play this back using a variety of media players, including `vlc`:

 [source,console]
 ----
-$ rpicam-vid -t 0 --inline -o - | cvlc stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/stream1}' :demux=h264
+$ vlc tcp://<ip-addr-of-server>:<port>
 ----

-For the best performance on Raspberry Pi 5, use the following command instead, which adds libav to force the H264 format:
+=== RTSP
+
+We can use VLC as an RTSP server; however, we must send it an MPEG-2 Transport Stream, as it no longer understands unencapsulated H.264:

 [source,console]
 ----
-$ rpicam-vid -t 0 --inline --libav-format h264 -o - | cvlc stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/stream1}' :demux=h264
+$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts -o - | cvlc stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/stream1}'
 ----

 To view video streamed over RTSP using a Raspberry Pi as a client, use the following command:

 [source,console]
 ----
-$ ffplay rtsp://<ip-addr-of-server>:8554/stream1 -vf "setpts=N/30" -fflags nobuffer -flags low_delay -framedrop
+$ ffplay rtsp://<ip-addr-of-server>:8554/stream1 -fflags nobuffer -flags low_delay -framedrop
 ----

 Alternatively, use the following command on a client to stream using VLC:
@@ -78,14 +86,13 @@ Alternatively, use the following command on a client to stream using VLC:
 $ vlc rtsp://<ip-addr-of-server>:8554/stream1
 ----

-To suppress the preview window on the server, use xref:camera_software.adoc#nopreview[`nopreview`].
+If you want to see a preview window on the server, just drop the `-n` option (see xref:camera_software.adoc#nopreview[`nopreview`]).

-Use the xref:camera_software.adoc#inline[`inline`] flag to force stream header information into every intra frame, which helps clients understand the stream if they miss the beginning.
+=== `libav` and Audio

-=== `libav`
+We have already been using `libav` as the backend for network streaming. `libav` also lets us add an audio stream, as long as we're using a format, like the MPEG-2 Transport Stream, that permits audio data.

-You can use the `libav` backend as a network streaming source for audio/video.
-To stream video over TCP using a Raspberry Pi as a server, use the following command, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+We can take one of our previous commands, like the one for streaming an MPEG-2 Transport Stream over TCP, and simply add the `--libav-audio` option:

 [source,console]
@@ -101,56 +108,99 @@ $ rpicam-vid -t 0 --codec libav --libav-format mpegts --libav-audio -o "udp://<

 === GStreamer

-https://gstreamer.freedesktop.org/[GStreamer] is a Linux framework for reading, processing and playing multimedia files. This section shows how to use `rpicam-vid` to stream video over a network.
+https://gstreamer.freedesktop.org/[GStreamer] is a Linux framework for reading, processing and playing multimedia files. We can also use it in conjunction with `rpicam-vid` for network streaming.
+
+This setup uses `rpicam-vid` to output an encoded H.264 bitstream to stdout. As we've done previously, we encapsulate this in an MPEG-2 Transport Stream for better downstream compatibility.

-This setup uses `rpicam-vid` to output an encoded h.264 bitstream to stdout. Then, we use the GStreamer `fdsrc` element to receive the bitstream, and extra GStreamer elements to send it over the network. On the server, run the following command to start the stream, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+We then use the GStreamer `fdsrc` element to receive the bitstream, and extra GStreamer elements to send it over the network. On the server, run the following command to start the stream, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:

 [source,console]
 ----
-$ rpicam-vid -t 0 -n --inline -o - | gst-launch-1.0 fdsrc fd=0 ! udpsink host=<ip-addr> port=<port>
+$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts -o - | gst-launch-1.0 fdsrc fd=0 ! udpsink host=<ip-addr> port=<port>
 ----

-On the client, run the following command to receive the stream, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+We could of course use anything (such as `vlc`) as the client, and the best GStreamer clients for playback are beyond the scope of this document. However, we note that the following pipeline (with the obvious substitutions) works on a Pi 4 or earlier device:

 [source,console]
 ----
-$ gst-launch-1.0 udpsrc address=<ip-addr> port=<port> ! h264parse ! v4l2h264dec ! autovideosink
+$ gst-launch-1.0 udpsrc address=<ip-addr> port=<port> ! tsparse ! tsdemux ! h264parse ! queue ! v4l2h264dec ! autovideosink
 ----

+For a Pi 5, replace `v4l2h264dec` with `avdec_h264`.
+
 TIP: To test this configuration, run the server and client commands in separate terminals on the same device, using `localhost` as the address.
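Spelled out with that substitution, the Pi 5 client pipeline would read as follows (an untested sketch, with the same placeholders):

[source,console]
----
$ gst-launch-1.0 udpsrc address=<ip-addr> port=<port> ! tsparse ! tsdemux ! h264parse ! queue ! avdec_h264 ! autovideosink
----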

-==== RTP
+==== `libcamerasrc` GStreamer element

-To stream using RTP, run the following command on the server, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+`libcamera` provides a `libcamerasrc` GStreamer element which can be used directly instead of `rpicam-vid`. To use this element, run the following command on the server, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming. On a Pi 4 or earlier device, use:

 [source,console]
 ----
-$ rpicam-vid -t 0 -n --inline -o - | gst-launch-1.0 fdsrc fd=0 ! h264parse ! rtph264pay ! udpsink host=<ip-addr> port=<port>
+$ gst-launch-1.0 libcamerasrc ! capsfilter caps=video/x-raw,width=640,height=360,format=NV12,interlace-mode=progressive ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1" ! 'video/x-h264,level=(string)4' ! h264parse ! mpegtsmux ! udpsink host=<ip-addr> port=<port>
 ----
+On a Pi 5, replace `v4l2h264enc extra-controls="controls,repeat_sequence_header=1"` with `x264enc speed-preset=1 threads=1`.
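Applying that substitution literally, a sketch of the full Pi 5 server command (untested here) would be:

[source,console]
----
$ gst-launch-1.0 libcamerasrc ! capsfilter caps=video/x-raw,width=640,height=360,format=NV12,interlace-mode=progressive ! x264enc speed-preset=1 threads=1 ! 'video/x-h264,level=(string)4' ! h264parse ! mpegtsmux ! udpsink host=<ip-addr> port=<port>
----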

-To receive over RTP, run the following command on the client, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+On the client, we can use the same playback pipeline as above, or other streaming media players.

-[source,console]
+=== WebRTC
+
+Streaming over WebRTC (for example, to web browsers) is best accomplished using third-party software. https://github.com/bluenviron/mediamtx[MediaMTX], for example, includes native Raspberry Pi camera support, which makes it easy to use.
+
+To install it, download the latest version from the https://github.com/bluenviron/mediamtx/releases[releases] page. Raspberry Pi OS 64-bit users will want the "linux_arm64v8" compressed tar file (ending `.tar.gz`). Unpack it and you will get a `mediamtx` executable and a configuration file called `mediamtx.yml`.
+
+It's worth backing up the `mediamtx.yml` file because it documents many Raspberry Pi camera options that you may want to investigate later.
+
+To stream the camera, replace the contents of `mediamtx.yml` with:
+----
+paths:
+  cam:
+    source: rpiCamera
+----
+and start the `mediamtx` executable. In a browser, enter `http://<ip-addr>:8889/cam` into the address bar.
+
+If you want MediaMTX to acquire the camera only when the stream is requested, add the following line to the previous `mediamtx.yml`:
 ----
-$ gst-launch-1.0 udpsrc address=<ip-addr> port=<port> caps=application/x-rtp ! rtph264depay ! h264parse ! v4l2h264dec ! autovideosink
+    sourceOnDemand: yes
 ----
+Consult the original `mediamtx.yml` for additional configuration parameters that let you select the image size, the camera mode, the bitrate and so on - just search for `rpi`.
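Since YAML is indentation-sensitive, the combined on-demand configuration would look like this (a sketch; `cam` is simply the path name chosen above):

----
paths:
  cam:
    source: rpiCamera
    sourceOnDemand: yes
----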

+==== Customised image streams with WebRTC

-If the client is not a Raspberry Pi it may have different GStreamer elements available. On an x86 device running Linux, you might run the following command instead:
+MediaMTX is great if you want to stream just the camera images. But what if we want to add some extra information or overlay, or do some extra processing on the images?
+
+Before starting, ensure that you've built a version of `rpicam-apps` that includes OpenCV support. Check it by running

 [source,console]
 ----
-$ gst-launch-1.0 udpsrc address=<ip-addr> port=<port> caps=application/x-rtp ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
+$ rpicam-hello --post-process-file rpicam-apps/assets/annotate_cv.json
 ----
+and looking for the overlaid text information at the top of the image.

-==== `libcamerasrc` GStreamer element
+Next, paste the following into your `mediamtx.yml` file:
+----
+paths:
+  cam:
+    source: udp://127.0.0.1:1234
+----

-`libcamera` provides a `libcamerasrc` GStreamer element which can be used directly instead of `rpicam-vid`. To use this element, run the following command on the server, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+Now start `mediamtx` and then, if you're using a Pi 5, enter the following in a new terminal window:

 [source,console]
 ----
-$ gst-launch-1.0 libcamerasrc ! capsfilter caps=video/x-raw,width=1280,height=720,format=NV12 ! v4l2convert ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1" ! 'video/x-h264,level=(string)4.1' ! h264parse ! rtph264pay ! udpsink host=<ip-addr> port=<port>
+$ rpicam-vid -t 0 -n --codec libav --libav-video-codec-opts "profile=baseline" --libav-format mpegts -o udp://127.0.0.1:1234?pkt_size=1316 --post-process-file rpicam-apps/assets/annotate_cv.json
 ----
+(On a Pi 4 or earlier device, leave out the `--libav-video-codec-opts "profile=baseline"` part of the command.)

-and on the client we use the same playback pipeline as previously.
+On another computer, you can now visit the same address as before, namely `http://<ip-addr-of-pi>:8889/cam`.

+The reason for specifying "baseline" profile on a Pi 5 is that MediaMTX doesn't support B frames, so we need to stop the encoder from producing them. On earlier devices, with hardware encoders, B frames are never generated, so there is no issue. On a Pi 5 you could alternatively replace this option with `--low-latency`, which also prevents B frames and produces a (slightly less well compressed) stream with reduced latency.
+
+[NOTE]
+====
+If you notice occasional pauses in the video stream, this may be because the UDP receive buffers on the Pi (passing data from `rpicam-vid` to MediaMTX) are too small. To increase them permanently, add
+----
+net.core.rmem_default=1000000
+net.core.rmem_max=1000000
+----
+to your `/etc/sysctl.conf` file (and reboot or run `sudo sysctl -p`).
+====