diff --git a/documentation/asciidoc/computers/camera/camera_usage.adoc b/documentation/asciidoc/computers/camera/camera_usage.adoc
index 6c1382952..722f37c82 100644
--- a/documentation/asciidoc/computers/camera/camera_usage.adoc
+++ b/documentation/asciidoc/computers/camera/camera_usage.adoc
@@ -12,3 +12,8 @@ Raspberry Pi produces several official camera modules, including:
 For more information about camera hardware, see the xref:../accessories/camera.adoc#about-the-camera-modules[camera hardware documentation].
 
 First, xref:../accessories/camera.adoc#install-a-raspberry-pi-camera[install your camera module]. Then, follow the guides in this section to put your camera module to use.
+
+[WARNING]
+====
+This guide no longer covers the _legacy camera stack_, which was available in Bullseye and earlier Raspberry Pi OS releases. The legacy camera stack, using applications like `raspivid`, `raspistill` and the original `Picamera` (_not_ `Picamera2`) Python library, has been deprecated for many years and is now unsupported. The legacy camera stack only supports the Camera Module 1, Camera Module 2 and the High Quality Camera, and will never support any newer camera modules. Nothing in this document is applicable to the legacy camera stack.
+====
diff --git a/documentation/asciidoc/computers/camera/rpicam_apps_multicam.adoc b/documentation/asciidoc/computers/camera/rpicam_apps_multicam.adoc
index 92c0891bf..fb387443a 100644
--- a/documentation/asciidoc/computers/camera/rpicam_apps_multicam.adoc
+++ b/documentation/asciidoc/computers/camera/rpicam_apps_multicam.adoc
@@ -8,4 +8,61 @@
 To list all the cameras available on your platform, use the xref:camera_software.adoc#list-cameras[`list-cameras`] option.
 To choose which camera to use, pass the camera index to the xref:camera_software.adoc#camera[`camera`] option.
 
-NOTE: `libcamera` does not yet provide stereoscopic camera support. When running two cameras simultaneously, they must be run in separate processes. This means there is no way to synchronise sensor framing or 3A operation between them. As a workaround, you could synchronise the cameras through an external sync signal for the HQ (IMX477) camera, and switch the 3A to manual mode if necessary.
+NOTE: `libcamera` does not yet provide stereoscopic camera support. When running two cameras simultaneously, they must be run in separate processes, meaning there is no way to synchronise 3A operation between them. As a workaround, you could synchronise the cameras through an external sync signal for the HQ (IMX477) camera, or use the software camera synchronisation support described below, switching the 3A to manual mode if necessary.
+
+==== Software Camera Synchronisation
+
+Raspberry Pi's _libcamera_ implementation has the ability to synchronise the frames of different cameras using only software. This causes one camera to adjust its frame timing so as to coincide as closely as possible with the frames of another camera. No soldering or hardware connections are required, and it works with all of Raspberry Pi's camera modules, and even third-party ones, so long as their drivers implement frame duration control correctly.
+
+**How it works**
+
+The scheme works by designating one camera to be the _server_. The server broadcasts timing messages onto the network at regular intervals, such as once a second. Meanwhile, other cameras, known as _clients_, listen to these messages, whereupon they may lengthen or shorten their frame times slightly so as to pull them into sync with the server. This process is continual, though after the first adjustment, subsequent adjustments are normally small.
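+
+For example, with illustrative numbers: at a nominal 30fps, each frame lasts about 33.3ms. If a client observes that its frames start 5ms later than the server's, it can shorten a single frame to roughly 28.3ms, after which its frames will once again start at (approximately) the same moments as the server's.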
+
+The client cameras may be attached to the same Raspberry Pi as the server, or to different Raspberry Pis on the same network. The camera model used by the clients may match the server's, or it may be different.
+
+Clients and servers need to be set running at the same nominal framerate (such as 30fps). Note that there is no back-channel from the clients to the server. It is solely the clients' responsibility to be up and running in time to match the server, and the server is completely unaware whether clients have synchronised successfully, or indeed whether there are any clients at all.
+
+In normal operation, running the same model of camera on the same Raspberry Pi, we would expect the frame start times of the camera images to match within several tens of microseconds. When the camera models are different, the discrepancy could be significantly larger, as the cameras will probably be unable to match framerates exactly and will therefore drift continually apart (being brought back together with every timing message).
+
+When cameras are on different devices, the system clocks should be synchronised using NTP (normally the case by default for Raspberry Pi OS), or, if this is insufficiently precise, another protocol like PTP might be used. Any discrepancy between system clocks feeds directly into extra error in the frame start times, even though the timestamps advertised on the frames will not reveal it.
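+
+For example, you can check whether a device's system clock is currently NTP-synchronised by running the following command and looking at the "System clock synchronized" line in its output:
+
+[source,console]
+----
+$ timedatectl status
+----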
+
+**The Server**
+
+The server, as previously explained, broadcasts timing messages onto the network, by default every second. The server runs for a fixed number of frames, by default 100, after which it informs the camera application on the device that the "synchronisation point" has been reached. At this moment, the application starts using the frames, so in the case of `rpicam-vid`, they start being encoded and recorded. Recall that the behaviour, and even the existence, of clients has no bearing on this.
+
+If required, there can be several servers on the same network, so long as they broadcast their timing messages to different network addresses. Clients, of course, will have to be configured to listen on the correct address.
+
+**Clients**
+
+Clients listen out for server timing messages and, when they receive one, shorten or lengthen a camera frame duration by the required amount so that subsequent frames start, as far as possible, at the same moment as the server's.
+
+The clients learn the correct "synchronisation point" from the server's messages and, just like the server, signal the camera application at the same moment that it should start using the frames. So in the case of `rpicam-vid`, this is once again the moment at which frames start being recorded.
+
+Normally it makes sense to start clients _before_ the server, as the clients will simply wait (because the "synchronisation point" has not yet been reached) until a server is seen broadcasting onto the network. This avoids timing problems where a server might reach its "synchronisation point" before all the clients have even been started.
+
+**Usage in `rpicam-vid`**
+
+We can use software camera synchronisation with `rpicam-vid` to record videos that are synchronised frame-by-frame. We're going to assume we have two cameras attached, and we're going to use camera 0 as the server and camera 1 as the client. `rpicam-vid` defaults to a fixed 30 frames per second, which will be fine for us.
+
+First, we should start the client:
+[source,console]
+----
+$ rpicam-vid -n -t 20s --camera 1 --codec libav -o client.mp4 --sync client
+----
+
+Note the `--sync client` parameter. This will record for 20 seconds, but _only_ once the synchronisation point has been reached. If necessary, it will wait indefinitely for the first server message.
+
+To start the server:
+[source,console]
+----
+$ rpicam-vid -n -t 20s --camera 0 --codec libav -o server.mp4 --sync server
+----
+
+This too will run for 20 seconds, counting from when the synchronisation point is reached and the recording starts. With the default synchronisation settings (100 frames at 30fps), this means clients have just over 3 seconds to get synchronised.
+
+The server's broadcast address and port, the frequency of the timing messages and the number of frames to wait for clients to synchronise can all be changed in the camera tuning file. Clients only pay attention to the broadcast address here, which should match the server's; the other information is ignored. Please refer to the https://datasheets.raspberrypi.com/camera/raspberry-pi-camera-guide.pdf[Raspberry Pi Camera tuning guide] for more information.
+
+In practical operation, there are a few final points to be aware of:
+
+* The fixed framerate needs to be below the maximum framerate at which the camera can operate (in the camera mode that is being used). This is because the synchronisation algorithm may need to _shorten_ camera frames so that clients can catch up with the server, and this will fail if the camera is already running as fast as it can.
+* Whilst camera frames should be correctly synchronised, at higher framerates, or depending on system load, it is possible for frames, either on the clients or the server, to be dropped. In these cases the frame timestamps will help an application to work out what has happened, though it's usually simpler to try to avoid frame drops in the first place - perhaps by lowering the framerate, increasing the number of buffers allocated to the camera queues (see the xref:camera_software.adoc#buffer-count[`--buffer-count` option]), or reducing the system load.
\ No newline at end of file
diff --git a/documentation/asciidoc/computers/camera/rpicam_configuration.adoc b/documentation/asciidoc/computers/camera/rpicam_configuration.adoc
index cd6d0f183..3f8651b3a 100644
--- a/documentation/asciidoc/computers/camera/rpicam_configuration.adoc
+++ b/documentation/asciidoc/computers/camera/rpicam_configuration.adoc
@@ -38,6 +38,8 @@ Raspberry Pi OS recognises the following overlays in `/boot/firmware/config.txt`
 To use one of these overlays, you must disable automatic camera detection. To disable automatic detection, set `camera_auto_detect=0` in `/boot/firmware/config.txt`. If `config.txt` already contains a line assigning a `camera_auto_detect` value, change the value to `0`. Reboot your Raspberry Pi with `sudo reboot` to load your changes.
+
+If your Raspberry Pi has two camera connectors (Raspberry Pi 5 or one of the Compute Modules, for example), then you can specify which one you are referring to by adding `,cam0` or `,cam1` (don't add any spaces) to the `dtoverlay` that you used from the table above. If you do not add either of these, it will default to checking camera connector 1 (`cam1`).
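+
+For example, to drive a High Quality Camera (IMX477) attached to camera connector 0, `/boot/firmware/config.txt` might contain something like:
+
+----
+camera_auto_detect=0
+dtoverlay=imx477,cam0
+----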
+
+Note, however, that for official Raspberry Pi camera modules, auto-detection will correctly identify all the cameras connected to your device.
 
 [[tuning-files]]
 ==== Tweak camera behaviour with tuning files
diff --git a/documentation/asciidoc/computers/camera/rpicam_options_common.adoc b/documentation/asciidoc/computers/camera/rpicam_options_common.adoc
index 21f0974e1..1f9f64b39 100644
--- a/documentation/asciidoc/computers/camera/rpicam_options_common.adoc
+++ b/documentation/asciidoc/computers/camera/rpicam_options_common.adoc
@@ -89,9 +89,19 @@ Alias: `-t`
 
 Default value: 5000 milliseconds (5 seconds)
 
-Specify how long the application runs before closing. This applies to both video recording and preview windows. When capturing a still image, the application shows a preview window for `timeout` milliseconds before capturing the output image.
+Specify how long the application runs before closing. This value is interpreted as a number of milliseconds unless an optional suffix is used to change the unit. The suffix may be one of:
 
-To run the application indefinitely, specify a value of `0`.
+
+* `min` - minutes
+* `s` or `sec` - seconds
+* `ms` - milliseconds (the default if no suffix is used)
+* `us` - microseconds
+* `ns` - nanoseconds
+
+This time applies to both video recording and preview windows. When capturing a still image, the application shows a preview window for the length of time specified by the `timeout` parameter before capturing the output image.
+
+To run the application indefinitely, specify a value of `0`. Floating point values are also permitted.
+
+Example: `rpicam-hello -t 0.5min` would run for 30 seconds.
 
 ==== `preview`
@@ -553,3 +563,11 @@ Flushes output files to disk as soon as a frame finishes writing, instead of wai
 Specifies a JSON file that configures the post-processing applied by the imaging pipeline. This applies to camera images _before_ they reach the application. This works similarly to the legacy `raspicam` "image effects". Accepts a file name path as input.
 
 Post-processing is a large topic and admits the use of third-party software like OpenCV and TensorFlowLite to analyse and manipulate images. For more information, see xref:camera_software.adoc#post-processing-with-rpicam-apps[post-processing].
+
+==== `buffer-count`
+
+The number of buffers to allocate for still image capture or for video recording. The default value of zero lets each application choose a reasonable number for its own use case (1 for still image capture, and 6 for video recording). Increasing the number can sometimes help to reduce the number of frame drops, particularly at higher framerates.
+
+==== `viewfinder-buffer-count`
+
+Like the `buffer-count` option, but applied when running in preview mode (that is, in `rpicam-hello`, or in the preview phase of `rpicam-still` rather than the capture phase).
diff --git a/documentation/asciidoc/computers/camera/rpicam_options_vid.adoc b/documentation/asciidoc/computers/camera/rpicam_options_vid.adoc
index 7a5bb71e9..66f0d8444 100644
--- a/documentation/asciidoc/computers/camera/rpicam_options_vid.adoc
+++ b/documentation/asciidoc/computers/camera/rpicam_options_vid.adoc
@@ -132,3 +132,10 @@ Records exactly the specified number of frames. Any non-zero value overrides xre
 
 Records exactly the specified framerate. Accepts a nonzero integer.
 
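+For example, a sketch that records exactly 50 frames per second and also allocates extra buffers (see the xref:camera_software.adoc#buffer-count[`buffer-count`] option above) to reduce the chance of dropped frames:
+
+[source,console]
+----
+$ rpicam-vid -t 10s --framerate 50 --buffer-count 12 -o test50.h264
+----
+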
+==== `low-latency`
+
+On a Pi 5, the `--low-latency` option reduces encoding latency, which may be beneficial for real-time streaming applications, at the cost of slightly reduced coding efficiency (for example, B frames and arithmetic coding will no longer be used).
+
+==== `sync`
+
+Run the camera in software synchronisation mode, where multiple cameras synchronise frames to the same moment in time. The `sync` mode can be set to either `client` or `server`. For more information, please refer to the detailed explanation of xref:camera_software.adoc#software-camera-synchronisation[how software synchronisation works].
\ No newline at end of file
diff --git a/documentation/asciidoc/computers/camera/rpicam_vid.adoc b/documentation/asciidoc/computers/camera/rpicam_vid.adoc
index f4ee22060..e88c5b762 100644
--- a/documentation/asciidoc/computers/camera/rpicam_vid.adoc
+++ b/documentation/asciidoc/computers/camera/rpicam_vid.adoc
@@ -11,13 +11,18 @@ For example, the following command writes a ten-second video to a file named `te
 $ rpicam-vid -t 10s -o test.h264
 ----
 
-You can play the resulting file with VLC and other video players:
+You can play the resulting file with ffplay and other video players:
 
 [source,console]
 ----
-$ vlc test.h264
+$ ffplay test.h264
 ----
 
+[WARNING]
+====
+Older versions of vlc were able to play H.264 files correctly, but recent versions do not, displaying only a few (and possibly garbled) frames. You should either use a different media player, or save your files in a more widely supported container format, such as MP4 (see below).
+====
+
 On Raspberry Pi 5, you can output to the MP4 container format directly by specifying the `mp4` file extension for your output file:
 
 [source,console]
 ----
@@ -25,6 +30,13 @@ On Raspberry Pi 5, you can output to the MP4 container format directly by specif
 $ rpicam-vid -t 10s -o test.mp4
 ----
 
+On Raspberry Pi 4, or earlier devices, you can save MP4 files using:
+
+[source,console]
+----
+$ rpicam-vid -t 10s --codec libav -o test.mp4
+----
+
 ==== Encoders
 
 `rpicam-vid` supports motion JPEG as well as both uncompressed and unformatted YUV420:
@@ -76,3 +88,11 @@ To enable the `libav` backend, pass `libav` to the xref:camera_software.adoc#cod
 ----
 $ rpicam-vid --codec libav --libav-format avi --libav-audio --output example.avi
 ----
+
+==== Low latency video with the Pi 5
+
+Pi 5 uses software video encoders. These generally output frames with a longer latency than the old hardware encoders, which can sometimes be an issue for real-time streaming applications.
+
+In this case, add the option `--low-latency` to the `rpicam-vid` command. This alters certain encoder options so that encoded frames are output more quickly.
+
+The downside is that coding efficiency is slightly reduced, and the processor's multiple cores may be used slightly less efficiently. The maximum framerate that can be encoded may also be slightly reduced (though it will still easily achieve 1080p30).
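+
+For example, a minimal sketch of low-latency streaming over UDP (substitute a real client address and port for the placeholders):
+
+[source,console]
+----
+$ rpicam-vid -t 0 -n --low-latency --inline -o udp://<ip-addr>:<port>
+----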
diff --git a/documentation/asciidoc/computers/camera/streaming.adoc b/documentation/asciidoc/computers/camera/streaming.adoc
index 0d7e378a3..ffcf9a656 100644
--- a/documentation/asciidoc/computers/camera/streaming.adoc
+++ b/documentation/asciidoc/computers/camera/streaming.adoc
@@ -1,6 +1,6 @@
 == Stream video over a network with `rpicam-apps`
 
-This section describes native streaming from `rpicam-vid`. You can also use the xref:camera_software.adoc#libav-integration-with-rpicam-vid[`libav`] backend for network streaming.
+This section describes how to stream video over a network using `rpicam-vid`. Whilst it's possible to stream very simple formats without using `libav`, for most applications we recommend using the xref:camera_software.adoc#libav-integration-with-rpicam-vid[`libav` backend].
 
 === UDP
 
 To stream video over UDP using a Raspberry Pi as a server, use the following command, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
 
 [source,console]
 ----
-$ rpicam-vid -t 0 --inline -o udp://<ip-addr>:<port>
+$ rpicam-vid -t 0 -n --inline -o udp://<ip-addr>:<port>
 ----
 
 To view video streamed over UDP using a Raspberry Pi as a client, use the following command, replacing the `<port>` placeholder with the port you would like to stream from:
 
 [source,console]
 ----
-$ vlc udp://@:<port> :demux=h264
+$ ffplay udp://@:<port> -fflags nobuffer -flags low_delay -framedrop
 ----
+As noted previously, `vlc` no longer handles unencapsulated H.264 streams.
 
-Alternatively, use the following command on a client to stream using `ffplay`:
+In fact, support for unencapsulated H.264 can generally be quite poor, so it is often better to send an MPEG-2 Transport Stream instead. Making use of `libav`, this can be accomplished with:
 
 [source,console]
 ----
-$ ffplay udp://<ip-addr>:<port> -fflags nobuffer -flags low_delay -framedrop
+$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts -o udp://<ip-addr>:<port>
 ----
 
-=== TCP
-
-You can also stream video over TCP. To use a Raspberry Pi as a server:
+In this case, we can also play the stream successfully with `vlc`:
 
 [source,console]
 ----
-$ rpicam-vid -t 0 --inline --listen -o tcp://0.0.0.0:<port>
+$ vlc udp://@:<port>
 ----
 
-To view video streamed over TCP using a Raspberry Pi as a client, use the following command:
+=== TCP
+
+You can also stream video over TCP. As before, we can send an unencapsulated H.264 stream over the network. To use a Raspberry Pi as a server:
 
 [source,console]
 ----
-$ vlc tcp/h264://<ip-addr>:<port>
+$ rpicam-vid -t 0 -n --inline --listen -o tcp://0.0.0.0:<port>
 ----
 
-Alternatively, use the following command on a client to stream using `ffplay` at 30 frames per second:
+To view video streamed over TCP using a Raspberry Pi as a client, assuming the server is running at 30 frames per second, use the following command:
 
 [source,console]
 ----
 $ ffplay tcp://<ip-addr>:<port> -vf "setpts=N/30" -fflags nobuffer -flags low_delay -framedrop
 ----
 
-=== RTSP
+But as with the UDP examples, it is often preferable to send an MPEG-2 Transport Stream, as this is generally better supported. To do this, use:
-To use VLC to stream video over RTSP using a Raspberry Pi as a server, use the following command:
+[source,console]
+----
+$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts -o tcp://0.0.0.0:<port>?listen=1
+----
+
+We can now play this back using a variety of media players, including `vlc`:
 
 [source,console]
 ----
-$ rpicam-vid -t 0 --inline -o - | cvlc stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/stream1}' :demux=h264
+$ vlc tcp://<ip-addr>:<port>
 ----
 
-For the best performance on Raspberry Pi 5, use the following command instead, which adds libav to force the H264 format:
+=== RTSP
+
+We can use VLC as an RTSP server; however, we must send it an MPEG-2 Transport Stream, as it no longer understands unencapsulated H.264:
 
 [source,console]
 ----
-$ rpicam-vid -t 0 --inline --libav-format h264 -o - | cvlc stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/stream1}' :demux=h264
+$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts -o - | cvlc stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/stream1}'
 ----
 
 To view video streamed over RTSP using a Raspberry Pi as a client, use the following command:
 
 [source,console]
 ----
-$ ffplay rtsp://<ip-addr>:8554/stream1 -vf "setpts=N/30" -fflags nobuffer -flags low_delay -framedrop
+$ ffplay rtsp://<ip-addr>:8554/stream1 -fflags nobuffer -flags low_delay -framedrop
 ----
 
 Alternatively, use the following command on a client to stream using VLC:
@@ -78,14 +86,13 @@ Alternatively, use the following command on a client to stream using VLC:
 $ vlc rtsp://<ip-addr>:8554/stream1
 ----
 
-To suppress the preview window on the server, use xref:camera_software.adoc#nopreview[`nopreview`].
+If you want to see a preview window on the server, just drop the `-n` option (see xref:camera_software.adoc#nopreview[`nopreview`]).
 
-Use the xref:camera_software.adoc#inline[`inline`] flag to force stream header information into every intra frame, which helps clients understand the stream if they miss the beginning.
+=== `libav` and Audio
 
-=== `libav`
-
-You can use the `libav` backend as a network streaming source for audio/video. To stream video over TCP using a Raspberry Pi as a server, use the following command, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+We have already been using `libav` as the backend for network streaming. `libav` allows us to add an audio stream, so long as we're using a format, like the MPEG-2 Transport Stream, that permits audio data.
+
+We can take one of our previous commands, like the one for streaming an MPEG-2 Transport Stream over TCP, and simply add the `--libav-audio` option:
 
 [source,console]
 ----
@@ -101,56 +108,99 @@ $ rpicam-vid -t 0 --codec libav --libav-format mpegts --libav-audio -o "udp://<
 
 === GStreamer
 
-https://gstreamer.freedesktop.org/[GStreamer] is a Linux framework for reading, processing and playing multimedia files. This section shows how to use `rpicam-vid` to stream video over a network.
+https://gstreamer.freedesktop.org/[GStreamer] is a Linux framework for reading, processing and playing multimedia files. We can also use it in conjunction with `rpicam-vid` for network streaming.
+
+This setup uses `rpicam-vid` to output an H.264 bitstream to stdout, though, as we've done previously, we're going to encapsulate it in an MPEG-2 Transport Stream for better downstream compatibility.
 
-This setup uses `rpicam-vid` to output an encoded h.264 bitstream to stdout. Then, we use the GStreamer `fdsrc` element to receive the bitstream, and extra GStreamer elements to send it over the network. On the server, run the following command to start the stream, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+Then, we use the GStreamer `fdsrc` element to receive the bitstream, and extra GStreamer elements to send it over the network. On the server, run the following command to start the stream, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
 
 [source,console]
 ----
-$ rpicam-vid -t 0 -n --inline -o - | gst-launch-1.0 fdsrc fd=0 ! udpsink host=<ip-addr> port=<port>
+$ rpicam-vid -t 0 -n --codec libav --libav-format mpegts -o - | gst-launch-1.0 fdsrc fd=0 ! udpsink host=<ip-addr> port=<port>
 ----
 
-On the client, run the following command to receive the stream, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+We could, of course, use anything (such as `vlc`) as the client, and a discussion of the best GStreamer playback pipelines is beyond the scope of this document. However, the following pipeline (with the obvious substitutions) will work on a Pi 4 or earlier device:
 
 [source,console]
 ----
-$ gst-launch-1.0 udpsrc address=<ip-addr> port=<port> ! h264parse ! v4l2h264dec ! autovideosink
+$ gst-launch-1.0 udpsrc address=<ip-addr> port=<port> ! tsparse ! tsdemux ! h264parse ! queue ! v4l2h264dec ! autovideosink
 ----
+For a Pi 5, replace `v4l2h264dec` with `avdec_h264`.
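+
+For example, a Pi 5 playback sketch with the same substitutions:
+
+[source,console]
+----
+$ gst-launch-1.0 udpsrc address=<ip-addr> port=<port> ! tsparse ! tsdemux ! h264parse ! queue ! avdec_h264 ! autovideosink
+----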
+
 TIP: To test this configuration, run the server and client commands in separate terminals on the same device, using `localhost` as the address.
 
-==== RTP
+==== `libcamerasrc` GStreamer element
 
-To stream using RTP, run the following command on the server, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+`libcamera` provides a `libcamerasrc` GStreamer element which can be used directly instead of `rpicam-vid`. To use this element, run the following command on the server, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming. On a Pi 4 or earlier device, use:
 
 [source,console]
 ----
-$ rpicam-vid -t 0 -n --inline -o - | gst-launch-1.0 fdsrc fd=0 ! h264parse ! rtph264pay ! udpsink host=<ip-addr> port=<port>
+$ gst-launch-1.0 libcamerasrc ! capsfilter caps=video/x-raw,width=640,height=360,format=NV12,interlace-mode=progressive ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1" ! 'video/x-h264,level=(string)4' ! h264parse ! mpegtsmux ! udpsink host=<ip-addr> port=<port>
 ----
+On a Pi 5, you would have to replace `v4l2h264enc extra-controls="controls,repeat_sequence_header=1"` with `x264enc speed-preset=1 threads=1`.
 
-To receive over RTP, run the following command on the client, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+On the client, we could use the same playback pipeline as we did just above, or other streaming media players.
 
-[source,console]
+=== WebRTC
+
+Streaming over WebRTC (for example, to web browsers) is best accomplished using third-party software. https://github.com/bluenviron/mediamtx[MediaMTX], for example, includes native Raspberry Pi camera support, which makes it easy to use.
+
+To install it, download the latest version from the https://github.com/bluenviron/mediamtx/releases[releases] page. Raspberry Pi OS 64-bit users will want the "linux_arm64v8" compressed tar file (ending `.tar.gz`). Unpack it and you will get a `mediamtx` executable and a configuration file called `mediamtx.yml`.
+
+It's worth backing up the `mediamtx.yml` file, because it documents many Raspberry Pi camera options that you may want to investigate later.
+
+To stream the camera, replace the contents of `mediamtx.yml` with:
+----
+paths:
+  cam:
+    source: rpiCamera
+----
+and start the `mediamtx` executable. In a browser, enter `http://<ip-addr>:8889/cam` into the address bar.
+
+If you want MediaMTX to acquire the camera only when the stream is requested, add the following line to the previous `mediamtx.yml`:
 ----
-$ gst-launch-1.0 udpsrc address=<ip-addr> port=<port> caps=application/x-rtp ! rtph264depay ! h264parse ! v4l2h264dec ! autovideosink
+    sourceOnDemand: yes
 ----
+
+Consult the original `mediamtx.yml` for additional configuration parameters that let you select the image size, the camera mode, the bitrate and so on; just search for `rpi`.
+
+==== Customised image streams with WebRTC
 
-If the client is not a Raspberry Pi it may have different GStreamer elements available. On an x86 device running Linux, you might run the following command instead:
+MediaMTX is great if you want to stream just the camera images. But what if you want to add some extra information or overlay, or do some extra processing on the images?
+
+Before starting, ensure that you've built a version of `rpicam-apps` that includes OpenCV support. Check this by running
 
 [source,console]
 ----
-$ gst-launch-1.0 udpsrc address=<ip-addr> port=<port> caps=application/x-rtp ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
+$ rpicam-hello --post-process-file rpicam-apps/assets/annotate_cv.json
 ----
+and looking for the overlaid text information at the top of the image.
 
-==== `libcamerasrc` GStreamer element
+Next, paste the following into your `mediamtx.yml` file:
+----
+paths:
+  cam:
+    source: udp://127.0.0.1:1234
+----
 
-`libcamera` provides a `libcamerasrc` GStreamer element which can be used directly instead of `rpicam-vid`. To use this element, run the following command on the server, replacing the `<ip-addr>` placeholder with the IP address of the client or multicast address and replacing the `<port>` placeholder with the port you would like to use for streaming:
+Now start `mediamtx`, and then, if you're using a Pi 5, enter the following in a new terminal window:
 
 [source,console]
 ----
-$ gst-launch-1.0 libcamerasrc ! capsfilter caps=video/x-raw,width=1280,height=720,format=NV12 ! v4l2convert ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1" ! 'video/x-h264,level=(string)4.1' ! h264parse ! rtph264pay ! udpsink host=<ip-addr> port=<port>
+$ rpicam-vid -t 0 -n --codec libav --libav-video-codec-opts "profile=baseline" --libav-format mpegts -o udp://127.0.0.1:1234?pkt_size=1316 --post-process-file rpicam-apps/assets/annotate_cv.json
 ----
+(On a Pi 4 or earlier device, leave out the `--libav-video-codec-opts "profile=baseline"` part of the command.)
 
-and on the client we use the same playback pipeline as previously.
+On another computer, you can now visit the same address as before, namely `http://<ip-addr>:8889/cam`.
+The reason for specifying "baseline" profile on a Pi 5 is that MediaMTX doesn't support B frames, so we need to stop the encoder from producing them. On earlier devices, with hardware encoders, B frames are never generated, so there is no issue. On a Pi 5 you could alternatively remove this option and replace it with `--low-latency`, which will also prevent B frames and produce a slightly less well-compressed stream with reduced latency.
+
+[NOTE]
+====
+If you notice occasional pauses in the video stream, this may be because the UDP receive buffers on the Pi (passing data from `rpicam-vid` to MediaMTX) are too small. To increase them permanently, add
+----
+net.core.rmem_default=1000000
+net.core.rmem_max=1000000
+----
+to your `/etc/sysctl.conf` file (and reboot or run `sudo sysctl -p`).
+====
\ No newline at end of file