Explain the new software camera synchronisation feature #4019
base: develop
Conversation
Very minor comments. Looks good otherwise!
@@ -8,4 +8,61 @@

To list all the cameras available on your platform, use the xref:camera_software.adoc#list-cameras[`list-cameras`] option. To choose which camera to use, pass the camera index to the xref:camera_software.adoc#camera[`camera`] option.

NOTE: `libcamera` does not yet provide stereoscopic camera support. When running two cameras simultaneously, they must be run in separate processes. This means there is no way to synchronise sensor framing or 3A operation between them. As a workaround, you could synchronise the cameras through an external sync signal for the HQ (IMX477) camera, and switch the 3A to manual mode if necessary.

NOTE: `libcamera` does not yet provide stereoscopic camera support. When running two cameras simultaneously, they must be run in separate processes. This means there is no way to fully synchronise sensor framing or 3A operation between them. As a workaround, you could synchronise the cameras through an external sync signal for the HQ (IMX477) camera, and switch the 3A to manual mode if necessary, or you could try the software camera synchronisation support that is described below.
no way to fully synchronise sensor framing
Maybe this needs to be omitted from the sentence above?
Yes, will rephrase slightly!
The scheme works by designating one camera to be the _server_. The server will broadcast timing messages onto the network at regular intervals, such as once a second. Meanwhile other cameras, known as _clients_, can listen to these messages whereupon they may lengthen or shorten frame times slightly so as to pull them into sync with the server. This process is continual, though after the first adjustment, subsequent adjustments are normally small.
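The correction a client applies each time it hears a timing message can be sketched in a few lines. This is a hypothetical illustration of the idea described above, not the actual libcamera algorithm, and all the names are invented:

```python
def client_adjustment_us(server_start_us, client_start_us, frame_us):
    """Return how much to lengthen (+) or shorten (-) the client's next
    frame so its frame starts drift toward the server's.

    Hypothetical sketch only; NOT libcamera's real implementation."""
    # Phase error of the client relative to the server, wrapped into
    # [-frame/2, +frame/2] so the correction takes the short way round.
    error = (client_start_us - server_start_us) % frame_us
    if error > frame_us / 2:
        error -= frame_us
    # A late client shortens its next frame; an early one lengthens it.
    return -error

# 30fps frames last 33333us; a client running 5000us late shortens
# one frame by 5000us, after which it is back in step with the server.
print(client_adjustment_us(0, 5000, 33333))
```

Because the process repeats with every timing message, small residual drift keeps being trimmed away, which matches the "continual, but small after the first adjustment" behaviour described above.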
The client cameras may be attached to the same Raspberry Pi device as the server, or they may be attached to different Raspberry Pis on the same network. The model of camera on the clients may match the server, or they may be different.
Minor nitpick:
s/model of camera/camera model/
Yep.
**Clients**

Clients listen out for server timing messages and, when they receive one, will shorten or lengthen a camera frame by the required amount so that subsequent frames will start, as far as possible, at the same moment as the server's.
s/camera frame/camera frame duration/
All done!
@nathan-contino are we ok to merge this update?
Also add related options, plus a few more options that seem to have been undocumented for a while.
f89d3c2 to 8c05c55
==== Software Camera Synchronisation

Raspberry Pi's _libcamera_ implementation has the ability to synchronise the frames of different cameras using only software. This will cause one camera to adjust its frame timing so as to coincide as closely as possible with the frames of another camera. No soldering or hardware connections are required, and it will work with all Raspberry Pi's camera modules, and even third party ones so long as their drivers implement frame duration control correctly.
minor nitpick: "work with all" -> "work with all of" ?
Yes.
Taking a look at this today, will make some minor copy edits and merge. Looks solid so far but I want to make sure everything complies with the style guide.
The clients learn the correct "synchronisation point" from the server's messages, and just like the server, will signal the camera application at the same moment that it should start using the frames. So in the case of `rpicam-vid`, this is once again the moment at which frames will start being recorded.

Normally it makes sense to start clients _before_ the server, as the clients will simply wait (the "syncrhonisation point" has not been reached) until a server is seen broadcasting onto the network. This obviously avoids timing problems where a server might reach its "synchronisation point" even before all the clients have been started!
"syncrhonisation" typo
Thanks!
$ rpicam-vid -n -t 20s --camera 0 --codec libav -o server.mp4 --sync server
----
This will run for 20 seconds but with the default settings (100 frames at 30fps) will give clients just over 3 seconds to get synchronised before anything is recorded. So the final video file will contain slightly under 17 seconds of video.
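The figures in that sentence follow from simple arithmetic; the 100 frames, 30fps, and 20-second values come straight from the text and command above:

```python
sync_frames = 100        # default number of frames the server waits at startup
framerate = 30           # frames per second
total_runtime_s = 20     # from the -t 20s option

wait_s = sync_frames / framerate       # time clients get to synchronise
recorded_s = total_runtime_s - wait_s  # duration actually recorded

# Just over 3 seconds to synchronise, slightly under 17 seconds of video.
print(f"sync wait: {wait_s:.2f}s, recorded: {recorded_s:.2f}s")
```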
"default settings" -> "default synchronisation settings" ?
Yep.
The server's broadcast address and port, the frequency of the timing messages and the number of frames to wait for clients to synchronise, can all be changed in the camera tuning file. Clients only pay attention to the broadcast address here which should match the server's; the other information will be ignored. Please refer to the https://datasheets.raspberrypi.com/camera/raspberry-pi-camera-guide.pdf[Raspberry Pi Camera tuning guide] for more information.
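Purely as an illustration of the kind of settings involved, such a tuning-file section might look like the sketch below. Every key name and value here is a hypothetical stand-in, not the real schema (which is documented in the tuning guide linked above): a broadcast group address and port, the number of frames between timing messages, and the number of frames to wait for clients.

```json
{
    "sync": {
        "group": "239.255.255.250",
        "port": 10000,
        "frames_between_messages": 30,
        "frames_to_wait": 100
    }
}
```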
I'm sure that this is probably a stupid question, but does it make sense for these settings to be in the camera tuning file, rather than being command-line arguments?
This kind of thing is always debatable, so it's a fair question. I'm slightly inclined to the view that there's no reason I would ever change this sort of thing once a session is running, so leaving it in the tuning file seems OK. You could, I suppose, argue that someone might prefer to tweak this stuff when they start the camera, and then start everything running. But they have to have some external means of knowing what these numbers should be, so why not put them in the tuning file? Though on the other hand... Argh!
I think I'll probably leave things as they are unless we can think of a clear reason to change things, just because that's less work at this point!
In practical operation there are a few final points to be aware of:

* The fixed framerate needs to be below the maximum framerate at which the camera can operate (in the camera mode that is being used). This is because the synchronisation algorithm may need to _shorten_ camera frames so that clients can catch up with the server, and this will fail if it is already running as fast as it can.
* Whilst cameras frames should be correctly synchronised, at higher framerates, or depending on system load, it is possible for frames, either on the clients or server, to be dropped. In these cases the frame timestamps will help an application to work out what has happened, though it's usually easier simply to try and avoid frame drops - perhaps by lowering the framerate, increasing the number of buffers being allocated to the camera queues, or reducing system load (see the xref:camera_software.adoc#buffer-count[`--buffer-count` option].)
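The first point is easy to sanity-check numerically. This sketch uses illustrative figures (the framerates here are examples, not values from the text) to show how much "shortening room" a fixed framerate leaves:

```python
def shortening_headroom_us(fixed_fps, max_fps):
    """Microseconds by which one frame could be shortened when running at
    fixed_fps in a camera mode that tops out at max_fps. Illustrative only."""
    frame_us = 1_000_000 / fixed_fps      # requested frame duration
    min_frame_us = 1_000_000 / max_fps    # shortest frame the mode allows
    return frame_us - min_frame_us

# 30fps in a mode capable of 50fps leaves ~13333us of room per frame,
# but running flat out at 50fps leaves none, so a client can never catch up.
print(shortening_headroom_us(30, 50))
print(shortening_headroom_us(50, 50))
```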
Should "cameras frames" be "camera's frames" or "camera frames"?
Indeed, "camera frames", I think!
"Whilst cameras frames should be correctly synchronised, at higher framerates, or depending on system load, it is possible for frames, either on the clients or server, to be dropped." - comma overload? 😉
Perhaps "Whilst camera frames should be correctly synchronised, at higher framerates or under high system load, it is possible for frames on either the clients or server to be dropped."
Yes, I can de-comma that a bit!
"usually easier simply to try" -> "usually easier to try" ?
Yep, will re-word a bit. Maybe "usually simpler to try".
@@ -38,6 +38,8 @@ Raspberry Pi OS recognises the following overlays in `/boot/firmware/config.txt`

To use one of these overlays, you must disable automatic camera detection. To disable automatic detection, set `camera_auto_detect=0` in `/boot/firmware/config.txt`. If `config.txt` already contains a line assigning a `camera_auto_detect` value, change the value to `0`. Reboot your Raspberry Pi with `sudo reboot` to load your changes.

If your Raspberry Pi has two camera connectors (Raspberry Pi 5 or CM4, for example), then you can specify which one you are referring to by adding `,cam0` or `,cam1` (don't add any spaces) to the `dtoverlay` that you used from the table above. If you do not add either of these, it will default to checking camera connector 1 (`cam1`). But note that for official Raspberry PI camera modules, auto-detection will correctly identify all the cameras connected to your device.
"PI" -> "Pi"
Thanks.
I think that all (most?) models of Compute Module include two camera ports, so I guess you could replace "or CM4" with "or one of the Compute Modules" ? 🤷
Will do.
==== `buffer-count`

The number of buffers to allocate for still image capture or for video recording. The default value of zero lets each application choose a value for itself (1 for still image capture, and 6 for video recording). Increasing the number can sometimes help to reduce the number of frame drops, particularly at higher framerates.
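The zero default described above boils down to a small selection rule, sketched here as hypothetical code (the function name is invented; this is not the actual rpicam-apps implementation):

```python
def resolve_buffer_count(requested, use_case):
    """Pick how many buffers to allocate for the camera queues.

    Hypothetical sketch of the behaviour described above: zero means
    'let the application choose' - 1 for still capture, 6 for video."""
    if requested != 0:
        return requested
    return 6 if use_case == "video" else 1

print(resolve_buffer_count(0, "still"))   # application default for stills: 1
print(resolve_buffer_count(0, "video"))   # application default for video: 6
print(resolve_buffer_count(12, "video"))  # an explicit request wins: 12
```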
"choose a value for itself" -> "use its default value" ? Or perhaps having "default value" twice in the same sentence is too confusing? @nathan-contino
Have gone with "The default value of zero lets each application choose a reasonable number for its own use case..."
----

[WARNING]
====
Older versions of vlc used to play H.264 files correctly, but recent versions do not - displaying only a few, or possibly garbled, frames. You should either use a different media player, or save your files in a more widely supported container format - such as MP4 (see below).
"used to play" -> "could play" ?
Will change. Maybe "were able to play".
As noted previously, `vlc` no longer handles unencapsulated h264 streams.

Alternatively, use the following command on a client to stream using `ffplay`:

In fact, support for unencapsulated h264 can generally be quite poor so it is often better to send an MPEG-2 Transport Stream instead. Making use of `libav`, this can be accomplished with:
Should the use of "h264" here actually be "H.264"? Or does it not matter?
h264 or H264 or h.264 or H.264? I think folks understand they're all the same, but I think H.264 is probably most common, so will use that.
https://gstreamer.freedesktop.org/[GStreamer] is a Linux framework for reading, processing and playing multimedia files. This section shows how to use `rpicam-vid` to stream video over a network.

https://gstreamer.freedesktop.org/[GStreamer] is a Linux framework for reading, processing and playing multimedia files. We can also use it in conjunction with `rpicam-vid` for network streaming.

This setup uses `rpicam-vid` to output an encoded h.264 bitstream to stdout. As we've done previously, we're going to encapsulate this in an MPEG-2 Transport Stream for better downstream compatibility.
In the example below, it looks like `rpicam-vid` is actually outputting an MPEG-2 Transport Stream to stdout?
It's not wrong, in that it is outputting an H.264 bitstream, though it's also been encapsulated (as explained in the next sentence). I'll run the two together, maybe: "This setup uses `rpicam-vid` to output an H.264 bitstream to stdout, though as we've done previously, we're going to encapsulate it in an MPEG-2 Transport Stream for better downstream compatibility."
I'm just going to convert this to "draft" for a bit. I'd like to fix the timer issue with rpicam-vid synchronisation (that the timer doesn't count from when the sync happens) first, then I'll update the PR again.
Also some other minor corrections.
All done now.
I've also added some other rather overdue updates, notably to the streaming section.
Also a warning banner at the top that this has nothing to do with the legacy stack. I still get folks complaining about this, all these years later...
@naushir You might want to read through the camera sync stuff too!