Is it possible to generate video containing multiple Events?
If I have a number of events that occurred in close succession, one right after another, is it possible to generate a single video from all those events? I haven't been able to determine how to do that, if there is a way to do it.
And without having duplicate frames? I've noticed that the last second or so of one event has the same frames and timestamps as the first second or so of the next event right after it. I realize that this overlap is due to the Pre and Post Event Image Count settings. So when creating a video made up of multiple events, I would not want to see the last second or so of the first event and then see that same second or so again because it is the beginning of the second event.
So, is this possible?
Are you asking if there's a way built into ZM to already do this, or are you looking for something custom?
I don't believe there's a way to do this in ZM natively. If I were to do something like this, a simple bash script could probably do it. All events are stored as captured frames. So you call a script 'joinevent <path> <path> <ignore>', and it takes 'n1 - <ignore>' frames from your first event path (where n1 = number of frames in event 1), takes the n2 frames from event 2 and renames them so the numbering keeps going in sequence after event 1, and then uses ffmpeg to convert with 'ffmpeg -i %03d-capture.jpg ...'
That description probably has more lines than the actual script would have. Just one way to do it.
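For what it's worth, here's a rough sketch of what such a script might look like, assuming the frames in each event directory are named along the lines of NNN-capture.jpg (the exact paths, frame-number width and ffmpeg options depend on your ZM version and settings):

Code:
#!/bin/bash
# joinevent.sh <event1-path> <event2-path> <ignore>
# Copy the frames of two consecutive events into ./joined, dropping the last
# <ignore> frames of event 1, renumber them into one sequence, then encode.
EV1="$1"; EV2="$2"; IGNORE="${3:-0}"
OUT=joined
mkdir -p "$OUT"

n=0
frames1=( "$EV1"/*-capture.jpg )
keep=$(( ${#frames1[@]} - IGNORE ))          # n1 - <ignore> frames from event 1
for f in "${frames1[@]:0:$keep}"; do
    n=$((n+1))
    cp "$f" "$OUT/$(printf '%05d' "$n")-capture.jpg"
done
for f in "$EV2"/*-capture.jpg; do            # all n2 frames from event 2
    n=$((n+1))
    cp "$f" "$OUT/$(printf '%05d' "$n")-capture.jpg"
done

# frame rate and codec options are only an example
ffmpeg -r 10 -i "$OUT/%05d-capture.jpg" joined.mp4

The numbering is rewritten to %05d here simply so one ffmpeg input pattern covers both events regardless of how many digits ZM used.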
eyeZm, Native iPhone App for ZoneMinder: http://www.eyezm.com
Version 1.3 now available on iTunes, introduces Montage view, HTTPS/SSL support and H264 native streaming!
Subscribe to RSS feed for updates: http://www.eyezm.com/rssfeed.xml
Yes I was asking natively, and I went down that path, writing a Perl script.
But in your example, if I understand it correctly, you would tell it how many images to skip at the end of event 1 or the beginning of event 2 so as to not have duplicates. But how do you know how many that would be? It might be 10, it might be 25, it might be 100. The individual frames do not have milliseconds as part of their time stamp. There is the delta in the DB, but once again the start time is not recorded with milliseconds, so that makes the delta moot because you don't know if the event started at 19:31:10.120 or 19:31:10.323 or 19:31:10.677 or 19:31:10.999, etc.
So I don't want to be watching the combined events and periodically see the action jump back a second or two or three and repeat the same action because frames that are at the end of one event are also at the beginning of the next event.
That is the issue, efficiently filtering out frames that may be at the end of one event but also at the beginning of the next event. And because of my Pre and Post Event Image Counts that could be up to 120 frames.
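Just to show what I mean: pulling the deltas out is easy enough, the precision is the problem. A rough sketch, assuming the stock Events/Frames tables and the default zmuser/zmpass credentials (adjust for your install): since StartTime is stored to whole seconds only, everything computed this way can be off by up to a second.

Code:
#!/bin/bash
# Print an approximate absolute time for every frame of an event:
# Events.StartTime (whole seconds) + Frames.Delta (fractional seconds).
# Since the true start could be anywhere within that second, each result
# carries up to about a second of error, which is why the delta alone doesn't solve it.
EVENT_ID="$1"
mysql -u zmuser -pzmpass zm -e "
  SELECT f.FrameId,
         e.StartTime,
         f.Delta,
         ADDTIME(e.StartTime, SEC_TO_TIME(f.Delta)) AS ApproxTime
  FROM Events e
  JOIN Frames f ON f.EventId = e.Id
  WHERE e.Id = $EVENT_ID
  ORDER BY f.FrameId;"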
Bottom line, if you are looking for that level of precision, I don't see how you are going to get it since the frames aren't timestamped anywhere that's accessible to you. Like you said, you can obtain the pre/post counts from the DB, but that would still lead to some gaps and/or skips I'm sure.
Shouldn't be hard to patch ZM to store the exact timestamp in the DB when it writes the frame information there... and that's not a bad idea to begin with...
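Purely as an illustration of the schema side of that idea (the column name is made up, and the capture daemons would still need patching to actually fill it in from whatever sub-second timing they have at capture time):

Code:
#!/bin/bash
# Hypothetical: add a microseconds column to Frames so each frame's capture
# time can be stored with sub-second precision alongside the existing
# whole-second TimeStamp. Nothing writes to it until ZM itself is patched.
mysql -u zmuser -pzmpass zm -e "
  ALTER TABLE Frames
    ADD COLUMN TimeStampUSec INT UNSIGNED NOT NULL DEFAULT 0;"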
eyeZm, Native iPhone App for ZoneMinder: http://www.eyezm.com
Version 1.3 now available on iTunes, introduces Montage view, HTTPS/SSL support and H264 native streaming!
Subscribe to RSS feed for updates: http://www.eyezm.com/rssfeed.xml
Ok, wrote my own Perl script to do this. Pulling info from the DB and making sure frames aren't duplicated from the end of one event to the beginning of the next event when concatenating them.
Works pretty slick. Soon as I can write up a little operating explanation I'll post it.
What did you end up doing to ensure no duplication?

BlankMan wrote: Ok, wrote my own Perl script to do this. Pulling info from the DB and making sure frames aren't duplicated from the end of one event to the beginning of the next event when concatenating them. Works pretty slick. Soon as I can write up a little operating explanation I'll post it.
eyeZm, Native iPhone App for ZoneMinder: http://www.eyezm.com
Version 1.3 now available on iTunes, introduces Montage view, HTTPS/SSL support and H264 native streaming!
Subscribe to RSS feed for updates: http://www.eyezm.com/rssfeed.xml
I track all images by their time stamp. So if I see an image whose time stamp matches images I have already processed for that date/time, I compare the current image to those previously processed images, and if I find a match I don't use it.
Brute force I know but best I could do under the circumstances.
Here's the output of a concatenation:
You can see that when the Used frame count is less than the actual Frame count for that event, there was overlap that has been removed.
Code:
Event 81331 Frames/Used: 1186/1186 Seconds: 54
Event 81332 Frames/Used: 707/698 Seconds: 36
Event 81349 Frames/Used: 490/490 Seconds: 29
Event 81350 Frames/Used: 514/514 Seconds: 26
Event 81351 Frames/Used: 1931/1837 Seconds: 102
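The actual script is Perl and works off the DB, but the comparison idea boils down to something like the sketch below: keep a record of the frames already used and skip any incoming frame whose content matches one of them. This assumes the overlapping frames are byte-identical copies of the same captures; checksums stand in here for the comparison the real script does, and the real script only compares images sharing a time stamp rather than hashing everything.

Code:
#!/bin/bash
# Append one more event's frames onto an already-built sequence, skipping any
# frame whose content matches a frame that's already in the output (i.e. the
# overlap created by the Pre/Post Event Image Counts). Illustration only.
NEW_EVENT="$1"          # directory holding the next event's NNN-capture.jpg files
OUT="${2:-joined}"      # directory the combined, renumbered frames accumulate in
mkdir -p "$OUT"

declare -A seen
for f in "$OUT"/*-capture.jpg; do            # remember what we've already used
    [ -e "$f" ] || continue
    seen[$(md5sum "$f" | cut -d' ' -f1)]=1
done

n=$(ls "$OUT" | wc -l)
for f in "$NEW_EVENT"/*-capture.jpg; do
    sum=$(md5sum "$f" | cut -d' ' -f1)
    if [ -z "${seen[$sum]}" ]; then          # not a duplicate of an earlier frame
        n=$((n+1))
        cp "$f" "$OUT/$(printf '%05d' "$n")-capture.jpg"
        seen[$sum]=1
    fi
done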
Good work! I wish more of these kinds of contributions were integrated into ZM itself so others can benefit more easily instead of having to download each contribution separately... maybe I'll look into integrating some of these myself.
eyeZm, Native iPhone App for ZoneMinder: http://www.eyezm.com
Version 1.3 now available on iTunes, introduces Montage view, HTTPS/SSL support and H264 native streaming!
Subscribe to RSS feed for updates: http://www.eyezm.com/rssfeed.xml
Thanks. If you go through the trouble of integrating it, you might want to add the milliseconds to each image's time stamp in the DB; that would make this process a heck of a lot easier. But that means tracking that time stamp for all the (Pre) buffered images so you'd have it when needed.

jdhar wrote: Good work! I wish more of these kinds of contributions were integrated into ZM itself so others can benefit more easily instead of having to download each contribution separately... maybe I'll look into integrating some of these myself.