To list all the cameras available on your platform, use the xref:camera_software.adoc#list-cameras[`list-cameras`] option. To choose which camera to use, pass the camera index to the xref:camera_software.adoc#camera[`camera`] option.
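For example, you can check which cameras are attached before choosing an index (`rpicam-hello` is used here, though the same option works with the other `rpicam` applications; the output depends on the cameras connected to your device):

[source,console]
----
$ rpicam-hello --list-cameras
----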
NOTE: `libcamera` does not yet provide stereoscopic camera support. When running two cameras simultaneously, they must be run in separate processes. This means there is no way to fully synchronise sensor framing or 3A operation between them. As a workaround, you could synchronise the cameras through an external sync signal for the HQ (IMX477) camera, and switch the 3A to manual mode if necessary, or you could try the software camera synchronisation support that is described below.

==== Software Camera Synchronisation

Raspberry Pi's _libcamera_ implementation has the ability to synchronise the frames of different cameras using only software. This causes one camera to adjust its frame timing so as to coincide as closely as possible with the frames of another camera. No soldering or hardware connections are required, and it works with all Raspberry Pi camera modules, and even third-party ones, so long as their drivers implement frame duration control correctly.

**How it works**

The scheme works by designating one camera to be the _server_. The server broadcasts timing messages onto the network at regular intervals, such as once a second. Meanwhile other cameras, known as _clients_, listen for these messages and may then lengthen or shorten their frame times slightly so as to pull them into sync with the server. This process is continual, though after the first adjustment, subsequent adjustments are normally small.

The client cameras may be attached to the same Raspberry Pi device as the server, or they may be attached to different Raspberry Pis on the same network. The camera model on the clients may match the server's, or it may be different.

Clients and the server need to be set running at the same nominal framerate (such as 30fps). Note that there is no back-channel from the clients to the server. It is solely the clients' responsibility to be up and running in time to match the server, and the server is completely unaware whether clients have synchronised successfully, or indeed whether there are any clients at all.

In normal operation, running the same model of camera on the same Raspberry Pi, we would expect the frame start times of the camera images to match within several tens of microseconds. When the camera models are different, the error could be significantly larger, as the cameras will probably not be able to match framerates exactly and will therefore be continually drifting apart (and brought back together with every timing message).

When cameras are on different devices, the system clocks should be synchronised using NTP (normally the case by default for Raspberry Pi OS), or, if this is insufficiently precise, another protocol like PTP might be used. Any discrepancy between the system clocks feeds directly into extra error in the frame start times, even though this error will not be visible in the timestamps advertised on the frames.
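A quick way to check whether a device's clock is being synchronised over the network (assuming Raspberry Pi OS, or another systemd-based distribution where `systemd-timesyncd` normally handles this) is:

[source,console]
----
$ timedatectl status
----

Look for `System clock synchronized: yes` in the output.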

**The Server**

The server, as previously explained, broadcasts timing messages onto the network, by default every second. The server will run for a fixed number of frames, by default 100, after which it will inform the camera application on the device that the "synchronisation point" has been reached. At this moment, the application will start using the frames, so in the case of `rpicam-vid`, they will start being encoded and recorded. Recall that the behaviour, and even the existence, of clients has no bearing on this.

If required, there can be several servers on the same network, so long as they broadcast their timing messages to different network addresses. Clients, of course, will have to be configured to listen on the correct address.

**Clients**

Clients listen out for server timing messages and, when they receive one, will shorten or lengthen a camera frame by the required amount so that subsequent frames will start, as far as possible, at the same moment as the server's.

The clients learn the correct "synchronisation point" from the server's messages and, just like the server, will signal the camera application at the moment it should start using the frames. So in the case of `rpicam-vid`, this is once again the moment at which frames will start being recorded.

Normally it makes sense to start clients _before_ the server, as the clients will simply wait (the "synchronisation point" has not been reached) until a server is seen broadcasting onto the network. This avoids timing problems where a server might reach its "synchronisation point" before all the clients have even been started.

**Usage in `rpicam-vid`**

We can use software camera synchronisation with `rpicam-vid` to record videos that are synchronised frame-by-frame. We're going to assume we have two cameras attached, and we're going to use camera 0 as the server and camera 1 as the client. `rpicam-vid` defaults to a fixed 30 frames per second, which will be fine for us.

First we should start the client:
[source,console]
----
$ rpicam-vid -n -t 20s --camera 1 --codec libav -o client.mp4 --sync client
----

Note the `--sync client` parameter. This will record for 20 seconds in total, but note that this _includes_ the time taken to start the server and achieve synchronisation. So while the start of the recordings, and all the frames, will be synchronised, the end of the recordings is not.

To start the server:
[source,console]
----
$ rpicam-vid -n -t 20s --camera 0 --codec libav -o server.mp4 --sync server
----

This will run for 20 seconds, but with the default settings (100 frames at 30fps) it will give clients just over 3 seconds to get synchronised before anything is recorded. So the final video file will contain slightly under 17 seconds of video.

The server's broadcast address and port, the frequency of the timing messages, and the number of frames to wait for clients to synchronise can all be changed in the camera tuning file. Clients only pay attention to the broadcast address here, which should match the server's; the other information will be ignored. Please refer to the https://datasheets.raspberrypi.com/camera/raspberry-pi-camera-guide.pdf[Raspberry Pi Camera tuning guide] for more information.
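As a purely illustrative sketch (the algorithm name and parameter keys below are assumptions, not taken from this document; the tuning guide linked above gives the authoritative names), such an entry in the tuning file might look something like:

[source,json]
----
{
    "rpi.sync":
    {
        "group": "239.255.255.250",
        "port": 10000,
        "frames": 100
    }
}
----

Here `group` and `port` would be the broadcast address and port for the timing messages, and `frames` the number of frames to wait before the "synchronisation point".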

In practical operation there are a few final points to be aware of:

* The fixed framerate needs to be below the maximum framerate at which the camera can operate (in the camera mode that is being used). This is because the synchronisation algorithm may need to _shorten_ camera frames so that clients can catch up with the server, and this will fail if the camera is already running as fast as it can.
* Whilst camera frames should be correctly synchronised, at higher framerates, or depending on system load, it is possible for frames, either on the clients or the server, to be dropped. In these cases the frame timestamps will help an application to work out what has happened, though it's usually easier simply to try to avoid frame drops - perhaps by lowering the framerate, increasing the number of buffers allocated to the camera queues (see the xref:camera_software.adoc#buffer-count[`--buffer-count`] option), or reducing system load.
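One way to check a recording for dropped frames is to write out the frame timestamps and look for unusually large gaps. The sketch below assumes a 30fps recording (so a nominal frame interval of about 33.3ms) and that the file written by `--save-pts` contains one millisecond timestamp per line, possibly after a comment header:

[source,console]
----
$ rpicam-vid -n -t 20s --camera 0 --codec libav -o server.mp4 --sync server --save-pts pts.txt
$ awk '!/^#/ { if (prev != "" && $1 - prev > 50) print "possible drop before line " NR ": " $1 - prev " ms"; prev = $1 }' pts.txt
----

Any interval well above the nominal 33.3ms (the threshold here is 50ms) suggests that one or more frames were dropped at that point.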