SyRenity wrote:
Hi.
Indeed, I even posted about this a year ago here in the forums. His 3rd method (http://www.codeproject.com/KB/audio-vid ... tion/3.jpg, about half way down the page) is said to give great detection quality while significantly lowering CPU usage.
What he does is basically pixelate the image before comparing it, thus lowering the number of pixels that need to be checked, and consequently the CPU required.
Mor, I wonder whether you could use it with your algorithm, and in this way solve the increased CPU usage you mentioned.
(Btw, how much higher exactly is this usage than with the standard ZM method?)

Yes, decimation is a standard technique to reduce the CPU cost. At the moment I'm not fussed about it, as the difference in CPU is ('only') in the order of 20 or 30% on my system, and I've got plenty of CPU to handle that. Decimation will also reduce the accuracy somewhat.
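To make it concrete, this is roughly what I mean by decimation in the change-detection step: only compare every Nth pixel instead of all of them. This is just an illustrative sketch over plain 8-bit greyscale buffers, not code from ZM itself; the function name and parameters are made up.

[code]
#include <cstdint>
#include <cstdlib>
#include <cstddef>

// Count 'changed' pixels between two greyscale frames, sampling only every
// step-th pixel in each direction (step = 1 means no decimation).
// Illustrative sketch only - not ZoneMinder's actual detection code.
static std::size_t CountChangedPixels( const uint8_t *ref, const uint8_t *cur,
                                       int width, int height,
                                       int step, int threshold )
{
    std::size_t changed = 0;
    for ( int y = 0; y < height; y += step )
    {
        const uint8_t *ref_row = ref + (std::size_t)y * width;
        const uint8_t *cur_row = cur + (std::size_t)y * width;
        for ( int x = 0; x < width; x += step )
        {
            if ( std::abs( (int)cur_row[x] - (int)ref_row[x] ) > threshold )
                changed++;
        }
    }
    return changed;
}
[/code]

With step = 2 only about a quarter of the pixels are touched, and with step = 4 about a sixteenth, which is where the CPU saving comes from, at the cost of possibly missing very small changes.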
Having said that, the next step is to use a median filter to further reduce image noise. That is significantly more expensive, which is where decimation will become needed. (This is akin to the 'Erosion' filter mentioned in the article, although erosion is a significantly more expensive operation.)
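For reference, this is the sort of thing I mean by a median filter, again only a sketch over an 8-bit greyscale buffer rather than anything from the ZM source. The cost is plain to see: every output pixel needs its 3x3 neighbourhood gathered and partially sorted.

[code]
#include <cstdint>
#include <cstddef>
#include <algorithm>

// 3x3 median filter over an 8-bit greyscale frame (the one-pixel border is
// simply copied across). Sketch only - shown to illustrate the per-pixel cost.
static void MedianFilter3x3( const uint8_t *src, uint8_t *dst,
                             int width, int height )
{
    std::copy( src, src + (std::size_t)width * height, dst );
    for ( int y = 1; y < height - 1; y++ )
    {
        for ( int x = 1; x < width - 1; x++ )
        {
            uint8_t window[9];
            int n = 0;
            for ( int dy = -1; dy <= 1; dy++ )
                for ( int dx = -1; dx <= 1; dx++ )
                    window[n++] = src[(std::size_t)(y + dy) * width + (x + dx)];
            // Median of the 9 neighbourhood values.
            std::nth_element( window, window + 4, window + 9 );
            dst[(std::size_t)y * width + x] = window[4];
        }
    }
}
[/code]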
It would probably be worth just adding decimation as a setting (2x, 4x or 8x decimation). I'll have a look at this when I start doing code optimization.
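As a setting it could be as simple as a fixed factor that produces a reduced copy of the frame for the expensive filters to work on. Again, only a sketch; the enum and function below are hypothetical, not existing ZM options or code.

[code]
#include <cstdint>
#include <cstddef>
#include <vector>

// Hypothetical decimation setting - not an existing ZM option.
enum Decimation { DECIMATE_NONE = 1, DECIMATE_2X = 2, DECIMATE_4X = 4, DECIMATE_8X = 8 };

// Make a reduced-size copy of a greyscale frame by keeping every factor-th
// pixel in each direction; the expensive filters can then run on this
// smaller buffer instead of the full frame. Sketch only.
static std::vector<uint8_t> Decimate( const uint8_t *src, int width, int height,
                                      Decimation setting,
                                      int &out_width, int &out_height )
{
    const int factor = setting;
    out_width = width / factor;
    out_height = height / factor;
    std::vector<uint8_t> out( (std::size_t)out_width * out_height );
    for ( int y = 0; y < out_height; y++ )
        for ( int x = 0; x < out_width; x++ )
            out[(std::size_t)y * out_width + x] =
                src[(std::size_t)(y * factor) * width + (x * factor)];
    return out;
}
[/code]

An 8x decimation brings a 640x480 frame down to 80x60, which should make even the median filter cheap.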