Video · Multi-camera
Connect any number of cameras.
Five transports behind one service API. Two SoC families. Zero-copy frame bus contract — every backend converges on the same buffer shape, so downstream consumers do not negotiate format per source.
How it works
One capture microservice, five transports.
| Transport | SoC backend | Notes |
|---|---|---|
| MIPI-CSI | i.MX 8M Plus (libcamera) | Native MIPI; no external bridge |
| GMSL2 | QCS family (QMMF) | Qualcomm-only path |
| USB UVC | Both | Standard V4L2 UVC class driver |
| Ethernet RTSP / ONVIF | Both | Network cameras; RTSPS via TLS |
| WebRTC (WHEP) | Both | Browser-compatible signalling |
Multiple sensors feed their per-platform backends; backends converge on a tee element with the zero-copy frame bus contract; the tee fans out to encoder, NPU, GPU ROI shader, and pose tracker.
```mermaid
flowchart LR
    S1[Sensor 1] --> B1[Backend<br/>QMMF or libcamera]
    S2[Sensor 2] --> B2[Backend]
    S3[Sensor N] --> B3[Backend]
    B1 --> T[Tee · zero-copy DMABuf]
    B2 --> T
    B3 --> T
    T --> E[Encoder]
    T --> N[NPU]
    T --> G[GPU ROI shader]
    T --> V[Pose tracker]
```
Architecture
What you do not write — the infrastructure MOS4 ships.
mos-camera-capture Handles the GStreamer pipeline end-to-end across five transports and two SoC families. Backend selected at runtime via device-tree probe.
NV12 dmabuf entry
Every backend normalises to video/x-raw(memory:DMABuf), format=NV12. CPU pixel mapping is a spec violation in production builds.
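The NV12 contract fixes the buffer geometry every consumer can rely on: a full-resolution Y plane followed by a half-height interleaved UV plane. A minimal sketch of that layout (the helper name is illustrative, not part of the shipped API):

```python
def nv12_plane_layout(width: int, height: int) -> dict:
    """Plane sizes and offsets for a tightly packed NV12 buffer:
    full-res Y plane, then a half-height interleaved UV plane (4:2:0)."""
    y_size = width * height
    uv_size = width * (height // 2)  # U and V interleaved, half vertical resolution
    return {
        "y_offset": 0,
        "y_size": y_size,
        "uv_offset": y_size,        # UV plane starts right after Y
        "uv_size": uv_size,
        "total": y_size + uv_size,  # 1.5 bytes per pixel overall
    }
```

For a 1080p frame this gives a 3,110,400-byte buffer with the UV plane at offset 2,073,600.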
mos-frame Cross-process SHM fanout via Linux dmabuf fd over SCM_RIGHTS. LATEST-N drop policy per subscriber — slow consumer never stalls the producer.
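The LATEST-N policy can be pictured as a per-subscriber bounded queue that silently evicts the oldest frame instead of back-pressuring the producer. A toy sketch of that semantics (class and method names are hypothetical, not mos-frame's API):

```python
from collections import deque

class LatestN:
    """Per-subscriber LATEST-N queue: keeps only the newest n frames,
    dropping the oldest, so a slow consumer never stalls the producer."""
    def __init__(self, n: int):
        self._q = deque(maxlen=n)  # deque evicts the oldest entry automatically
        self.dropped = 0           # count of frames this subscriber lost

    def publish(self, frame) -> None:
        if len(self._q) == self._q.maxlen:
            self.dropped += 1      # oldest frame is about to be evicted
        self._q.append(frame)

    def take(self):
        return self._q.popleft() if self._q else None
```

A subscriber with n=2 that receives frames 1, 2, 3 without consuming will read 2 then 3, having dropped frame 1.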
CaptureControlService ListDevices · ConfigureStream · StartCapture · StopCapture · GetStatus · ResetCapture — one surface for every platform.
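The six calls imply a small lifecycle: configure a stream before starting it, stop before reset. A toy in-memory stand-in showing that call ordering (the method names mirror the doc; the state machine and signatures are assumptions, not the real service):

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    CONFIGURED = auto()
    CAPTURING = auto()

class CaptureControlStub:
    """In-memory sketch of the CaptureControlService lifecycle."""
    def __init__(self, devices):
        self._devices = list(devices)
        self._state = State.IDLE

    def list_devices(self):
        return list(self._devices)

    def configure_stream(self, device: str, width: int, height: int, fps: int):
        if device not in self._devices:
            raise ValueError(f"unknown device {device!r}")
        self._config = (device, width, height, fps)
        self._state = State.CONFIGURED

    def start_capture(self):
        if self._state is not State.CONFIGURED:
            raise RuntimeError("ConfigureStream must precede StartCapture")
        self._state = State.CAPTURING

    def stop_capture(self):
        self._state = State.CONFIGURED

    def get_status(self) -> str:
        return self._state.name

    def reset_capture(self):
        self._state = State.IDLE
```

Because the surface is the same on every platform, integration code written against it never branches on SoC family.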
FAQ
Frequently asked questions
How many cameras at once?
There is no fixed cap — the bound is frame-bus bandwidth and the producer count the SoC silicon supports. Production deployments routinely run dual-optic dashcams; multi-camera surveillance arrays of four or more are supported on AI-class silicon.
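A back-of-envelope check of the bandwidth bound, assuming raw NV12 at 1.5 bytes per pixel (the resolutions and rates below are illustrative, not a stated limit):

```python
def nv12_bus_bandwidth_mbs(width: int, height: int, fps: int, cameras: int) -> float:
    """Raw NV12 frame-bus load in MB/s: 1.5 bytes/pixel per frame,
    times frame rate, times number of cameras."""
    bytes_per_frame = width * height * 3 // 2
    return bytes_per_frame * fps * cameras / 1e6
```

Four 1080p cameras at 30 fps put roughly 373 MB/s of raw NV12 on the bus — well within DMABuf reach on AI-class silicon, but a useful sanity check before adding a fifth 4K sensor.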
Can I mix MIPI-CSI and IP cameras on the same box?
Yes. The zero-copy frame bus contract normalises every backend to the same buffer shape before the tee element. Downstream consumers (encoder, NPU, GPU crop) do not negotiate format per source.
What if I add a USB camera at runtime?
mos-camera-capture supports hot-plug for UVC and handles the GStreamer pipeline lifecycle end-to-end — start, stop, error recovery, hot-plug — without integration code.
Do I need to write a per-platform GStreamer pipeline?
No. The capture-control service (ListDevices, ConfigureStream, StartCapture, StopCapture, GetStatus, ResetCapture) is the integration surface. Backend selection happens at runtime via device-tree probe; no recompile per hardware variant.
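The device-tree probe can be pictured as a lookup from the board's compatible string to a backend. A hypothetical sketch — the mapping table and function below are illustrative, not the shipped probe logic:

```python
# Illustrative compatible-string prefixes; the real table is internal to MOS4.
BACKEND_BY_COMPATIBLE = {
    "fsl,imx8mp": "libcamera",  # i.MX 8M Plus family
    "qcom,qcs": "qmmf",         # Qualcomm QCS family
}

def select_backend(compatible: str) -> str:
    """Pick a capture backend from a device-tree compatible string
    (e.g. the contents of /proc/device-tree/compatible on Linux)."""
    for prefix, backend in BACKEND_BY_COMPATIBLE.items():
        if prefix in compatible:
            return backend
    raise RuntimeError(f"no capture backend for {compatible!r}")
```

Because the probe happens at service startup, the same binary runs unmodified on either SoC family.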
Bring your cameras.
MIPI / GMSL / UVC / RTSP / WebRTC — pick your transport mix and we will sketch the capture topology.