Viewing a recording using the MP4 and MJPEG codecs
Posted: Mon Feb 05, 2024 5:26 pm
Browser: Google Chrome 109.0.5414.168
I have a recording with the following properties:
Bit rate: 16.7 Mb/s
Width: 3840 pixels
Height: 2160 pixels
Display aspect ratio: 16:9
Frame rate mode: Variable
Frame rate: 15.000 FPS
Minimum frame rate: 7.643 FPS
Maximum frame rate: 105.140 FPS
Produced with the following monitor settings:
Capture Resolution (pixels): 640x360
Video Writer: Camera Passthrough
Stream from camera: 3840x2160, 15 fps, H.264 Baseline
When viewing a recording using MJPEG, the "nph-zms" process consumes one full Xeon core, which is acceptable in principle, though a bit much. Memory consumption is negligible. The navigation time bar under the video works, and I can click any point to start playback there. The video is displayed at 640x360 px quality.
If I switch to the MP4 codec, CPU and memory consumption do not increase at all compared to before playback started! The video quality is ideal, i.e. 3840x2160 at 15 fps. Fast rewind at 5x works perfectly.
I'm shocked!!!
But with the MP4 codec, although a navigation time bar is shown under the video, it cannot be used. That is, I can only watch from the beginning and cannot jump to an arbitrary point, which makes me very sad.
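One common reason a browser can play an MP4 from the start but cannot seek in it is the layout of the file itself: for progressive HTTP playback the index (the `moov` box) must come before the media data (`mdat`), and fragmented or non-faststart files behave exactly as described above. This is only a guess at the cause, not a confirmed ZoneMinder issue, but it is easy to check with a minimal pure-Python sketch that reports the top-level box order of a recording:

```python
import struct

def mp4_box_order(path):
    """Return the names of the top-level MP4 boxes in file order."""
    boxes = []
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            size, name = struct.unpack(">I4s", header)
            boxes.append(name.decode("ascii", "replace"))
            if size == 1:
                # 64-bit extended size follows the 8-byte header
                size = struct.unpack(">Q", f.read(8))[0]
                f.seek(size - 16, 1)
            elif size == 0:
                # box extends to the end of the file
                break
            else:
                f.seek(size - 8, 1)
    return boxes

def is_faststart(path):
    """True when the moov (index) box precedes mdat, i.e. the file is
    seek-friendly when served progressively over HTTP."""
    order = mp4_box_order(path)
    return ("moov" in order and "mdat" in order
            and order.index("moov") < order.index("mdat"))
```

If `mdat` comes before `moov` in the event file, remuxing with `ffmpeg -i in.mp4 -c copy -movflags +faststart out.mp4` reorders the boxes without re-encoding, which may make seeking work in the browser.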
1. Is it possible to fix this somehow? It seems to me that in some version (maybe 1.36) the navigation bar worked with MP4.
2. Since MP4 works so well and MJPEG works satisfactorily for playback of recordings, why does live viewing at 8K resolution work so badly? Is there any difference between processing the RTSP stream from the camera and processing the stream read back from the HDD?