`documentation/asciidoc/computers/camera/rpicam_configuration.adoc`

To use one of these overlays, you must disable automatic camera detection. To disable automatic detection, set `camera_auto_detect=0` in `/boot/firmware/config.txt`. If `config.txt` already contains a line assigning a `camera_auto_detect` value, change the value to `0`. Reboot your Raspberry Pi with `sudo reboot` to load your changes.

If your Raspberry Pi has two camera connectors (Raspberry Pi 5 or CM4, for example), you can specify which one you are referring to by adding `,cam0` or `,cam1` (without any spaces) to the `dtoverlay` you used from the table above. If you do not add either of these, it defaults to camera connector 1 (`cam1`). Note that for official Raspberry Pi camera modules, auto-detection will correctly identify all of the cameras connected to your device.
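
For example, a minimal sketch of the relevant `/boot/firmware/config.txt` lines, assuming the `imx219` overlay (Camera Module 2) and a camera attached to connector 0; substitute the overlay name for your own sensor from the table above:

----
camera_auto_detect=0
dtoverlay=imx219,cam0
----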

`documentation/asciidoc/computers/camera/rpicam_options_vid.adoc`

Records exactly the specified framerate. Accepts a nonzero integer.

==== `low-latency`
On a Pi 5, the `--low-latency` option reduces encoding latency, which may be beneficial for real-time streaming applications, at the cost of slightly lower coding efficiency (for example, B frames and arithmetic coding will no longer be used).
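
As a sketch, the option is simply added to an ordinary `rpicam-vid` command (the file name here is illustrative):

[source,console]
----
$ rpicam-vid -t 10s --low-latency -o test.h264
----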

`documentation/asciidoc/computers/camera/rpicam_vid.adoc`

For example, the following command writes a ten-second video to a file named `test.h264`:

[source,console]
----
$ rpicam-vid -t 10s -o test.h264
----

You can play the resulting file with ffplay and other video players:

[source,console]
----
$ ffplay test.h264
----

[WARNING]
====
Older versions of VLC used to play H.264 files correctly, but recent versions do not: they display only a few, possibly garbled, frames. You should either use a different media player, or save your files in a more widely supported container format such as MP4 (see below).
====

On Raspberry Pi 5, you can output to the MP4 container format directly by specifying the `mp4` file extension for your output file:

[source,console]
----
$ rpicam-vid -t 10s -o test.mp4
----

On Raspberry Pi 4 or earlier devices, you can save MP4 files using:

[source,console]
----
$ rpicam-vid -t 10s --codec libav -o test.mp4
----

==== Encoders
`rpicam-vid` supports motion JPEG as well as both uncompressed and unformatted YUV420:
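
As a sketch, the encoder is selected with the `--codec` option (file names here are illustrative):

[source,console]
----
$ rpicam-vid -t 10s --codec mjpeg -o test.mjpeg
$ rpicam-vid -t 10s --codec yuv420 -o test.data
----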

To enable the `libav` backend, pass `libav` to the xref:camera_software.adoc#codec[`codec`] option:

[source,console]
----
$ rpicam-vid --codec libav --libav-format avi --libav-audio --output example.avi
----

==== Low latency video with the Pi 5

Pi 5 uses software video encoders. These generally output frames with a longer latency than the hardware encoders on earlier devices, which can sometimes be an issue for real-time streaming applications.

In this case, add the `--low-latency` option to the `rpicam-vid` command. This alters certain encoder settings so that encoded frames are output more quickly.

The downside is that coding efficiency is slightly reduced, and the processor's multiple cores may be used slightly less efficiently. The maximum framerate that can be encoded may also drop slightly, though it will still comfortably achieve 1080p30.
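
As a sketch, a low-latency stream over UDP might look like this (the output URL and other options are assumptions based on the streaming examples later on this page):

[source,console]
----
$ rpicam-vid -t 0 --inline --low-latency -o udp://<ip-addr>:<port>
----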
This section describes how to stream video over a network using `rpicam-vid`. Whilst it's possible to stream very simple formats without using `libav`, for most applications we recommend using the xref:camera_software.adoc#libav-integration-with-rpicam-vid[`libav` backend].

=== UDP
To stream video over UDP using a Raspberry Pi as a server, use the following command, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
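
A minimal sketch of such a command (the exact options are assumptions; `--inline` writes stream header information into every intra frame so a client can join mid-stream):

[source,console]
----
$ rpicam-vid -t 0 --inline -o udp://<ip-addr>:<port>
----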
To view video streamed over UDP using a Raspberry Pi as a client, use the following command, replacing the `<port>` placeholder with the port you would like to stream from:
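
For example, a sketch using `ffplay` (the URL form and the latency-reducing flags are assumptions; use the same address and port that the server is sending to):

[source,console]
----
$ ffplay udp://<ip-addr>:<port> -fflags nobuffer -flags low_delay -framedrop
----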
As noted previously, `vlc` no longer handles unencapsulated h264 streams.
In fact, support for unencapsulated h264 can generally be quite poor, so it is often better to send an MPEG-2 Transport Stream instead. Making use of `libav`, this can be accomplished with:
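
A sketch of such a command (the `mpegts` format name and the other options are assumptions):

[source,console]
----
$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts -o udp://<ip-addr>:<port>
----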

Alternatively, use the following command on a client to stream using VLC:

[source,console]
----
$ vlc rtsp://<ip-addr-of-server>:8554/stream1
----
If you want to see a preview window on the server, just drop the `-n` option (see xref:camera_software.adoc#nopreview[`nopreview`]).

=== `libav` and Audio

We have already been using `libav` as the backend for network streaming. `libav` allows us to add an audio stream, so long as we're using a format, such as the MPEG-2 Transport Stream, that permits audio data.
We can take one of our previous commands, like the one for streaming an MPEG-2 Transport Stream over TCP, and simply add the `--libav-audio` option:
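
A sketch of such a command, assuming `libav`'s TCP URL syntax in which `?listen=1` makes the Pi act as the server (these options are assumptions):

[source,console]
----
$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts --libav-audio -o "tcp://0.0.0.0:<port>?listen=1"
----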
https://gstreamer.freedesktop.org/[GStreamer] is a Linux framework for reading, processing and playing multimedia files. We can also use it in conjunction with `rpicam-vid` for network streaming.
This setup uses `rpicam-vid` to output an encoded h.264 bitstream to stdout. As we've done previously, we're going to encapsulate this in an MPEG-2 Transport Stream for better downstream compatibility.
Then, we use the GStreamer `fdsrc` element to receive the bitstream, and extra GStreamer elements to send it over the network. On the server, run the following command to start the stream, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
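
A sketch of such a server command, piping the Transport Stream from `rpicam-vid` into GStreamer (the exact options are assumptions):

[source,console]
----
$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts -o - | gst-launch-1.0 fdsrc fd=0 ! udpsink host=<ip-addr> port=<port>
----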
On the client, run the following command to receive the stream, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:

We could of course use anything (such as `vlc`) as the client, and a survey of the best GStreamer playback pipelines is beyond the scope of this document. However, we note that the following pipeline (with the obvious substitutions) would work on a Pi 4 or earlier device:
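
A sketch of such a playback pipeline (element names other than `v4l2h264dec`, which is referenced below, are assumptions):

[source,console]
----
$ gst-launch-1.0 udpsrc address=<ip-addr> port=<port> ! tsparse ! tsdemux ! h264parse ! queue ! v4l2h264dec ! autovideosink
----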
For a Pi 5, replace `v4l2h264dec` by `avdec_h264`.

TIP: To test this configuration, run the server and client commands in separate terminals on the same device, using `localhost` as the address.

==== `libcamerasrc` GStreamer element
`libcamera` provides a `libcamerasrc` GStreamer element which can be used directly instead of `rpicam-vid`. To use this element, run the following command on the server, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming. On a Pi 4 or earlier device, use:
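
A sketch of such a pipeline; the resolution, format and the muxing into an MPEG-2 Transport Stream are assumptions, and only the `v4l2h264enc` element with its `extra-controls` setting is referenced in the note below:

[source,console]
----
$ gst-launch-1.0 libcamerasrc ! video/x-raw,width=1280,height=720,format=NV12 ! \
    v4l2h264enc extra-controls="controls,repeat_sequence_header=1" ! 'video/x-h264,level=(string)4' ! \
    h264parse ! mpegtsmux ! udpsink host=<ip-addr> port=<port>
----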
On a Pi 5 you would have to replace `v4l2h264enc extra-controls="controls,repeat_sequence_header=1"` by `x264enc speed-preset=1 threads=1`.

On the client, we could use the same playback pipeline as above, or another streaming media player.

=== WebRTC
Streaming over WebRTC (to web browsers, for example) is best accomplished using third-party software. https://github.com/bluenviron/mediamtx[MediaMTX], for instance, includes native Raspberry Pi camera support, which makes it easy to use.
To install it, download the latest version from the https://github.com/bluenviron/mediamtx/releases[releases] page. Raspberry Pi OS 64-bit users will want the "linux_arm64v8" compressed tar file (ending `.tar.gz`). Unpack it and you will get a `mediamtx` executable and a configuration file called `mediamtx.yml`.
It's worth backing up the `mediamtx.yml` file because it documents many Raspberry Pi camera options that you may want to investigate later.

To stream the camera, replace the contents of `mediamtx.yml` with:

----
paths:
  cam:
    source: rpiCamera
----

and start the `mediamtx` executable. In a browser, enter `http://<ip-addr>:8889/cam` into the address bar.
If you want MediaMTX to acquire the camera only when the stream is requested, add the following line to the previous `mediamtx.yml`:
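
A sketch, assuming MediaMTX's standard per-path `sourceOnDemand` parameter (check your original `mediamtx.yml` for the exact name):

----
paths:
  cam:
    source: rpiCamera
    sourceOnDemand: yes
----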
Consult the original `mediamtx.yml` for additional configuration parameters that let you select the image size, the camera mode, the bitrate and so on; just search for `rpi`.

==== Customised image streams with WebRTC
MediaMTX is great if you want to stream just the camera images. But what if you want to add some extra information or an overlay, or do some extra processing on the images?

Before starting, ensure that you've built a version of `rpicam-apps` that includes OpenCV support. You can check by running one of the OpenCV post-processing stages, as sketched below, and looking for the overlaid text information at the top of the image.
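
A sketch of such a check, assuming the post-processing assets are installed in the standard location (both the path and the asset name are assumptions; adjust them to match your installation):

[source,console]
----
$ rpicam-hello -t 5s --post-process-file /usr/share/rpi-camera-assets/annotate_cv.json
----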
Next, paste the following into your `mediamtx.yml` file:

----
paths:
  cam:
    source: udp://127.0.0.1:1234
----
Now, start `mediamtx` and then, if you're using a Pi 5, in a new terminal window, enter:
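
A sketch of such a command, assuming the `annotate_cv.json` post-processing file from the check above; the post-processing file, the `mpegts` format and the other options are assumptions, while the UDP address matches the `mediamtx.yml` entry:

[source,console]
----
$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts \
    --libav-video-codec-opts "profile=baseline" \
    --post-process-file /usr/share/rpi-camera-assets/annotate_cv.json \
    -o udp://127.0.0.1:1234
----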
(On a Pi 4 or earlier device, leave out the `--libav-video-codec-opts "profile=baseline"` part of the command.)
On another computer, you can now visit the same address as before, namely `http://<ip-addr-of-pi>:8889/cam`.

The reason for specifying the `baseline` profile on a Pi 5 is that MediaMTX doesn't support B frames, so we need to stop the encoder from producing them. On earlier devices, with hardware encoders, B frames are never generated, so there is no issue. On a Pi 5 you could alternatively replace this option with `--low-latency`, which also prevents B frames and produces a (slightly less efficiently compressed) stream with reduced latency.

[NOTE]
====
If you notice occasional pauses in the video stream, this may be because the UDP receive buffers on the Pi (passing data from `rpicam-vid` to MediaMTX) are too small. To increase them permanently, add

----
net.core.rmem_default=1000000
net.core.rmem_max=1000000
----
to your `/etc/sysctl.conf` file (and reboot or run `sudo sysctl -p`).
====