Geoff Thiel

John, your 'guess' that VCA is less robust than VideoIQ is not right. Have a look at this 'side by side shootout' test video on YouTube, which shows VideoIQ performing worse in demanding situations.

Many have moved from VideoIQ to VCA-based products not just because of price but also because VCA is more robust and accurate.

TESTING – VCA has an extensive library of test video with scenes selected to test robustness and accuracy. In this case the foliage movement, rippling water, low contrast and stationary person scenes cause VideoIQ to false-alarm or fail to track, whereas VCA continued to operate correctly. The test method, described at the end of the video, ensured that the testing was equal, fair and repeatable. We have tested quite a few competitor products this way and so far none of them has out-performed VCA on these test clips.

VCA Technology also has i-LIDS video analytics (sterile zone) certification. Currently, this is the only fully independent video analytics test standard available in the world and it is an important question to ask a manufacturer if you are thinking of using their video analytics outdoors. i-LIDS is sponsored by the UK government but testing is open to all analytics manufacturers. (Put 'i-LIDS testing' into Google if you want to find out more).

John Honovich

Btw, i-LIDS is a UK-only certification, so don't spin me anything about the world. And it is staged, not a head-to-head comparison, so don't mislead people into thinking that i-LIDS supports your marketing video.

Geoff Thiel

John, I feel that VCA has the right to reply. Your 'guess' was unfairly damaging to VCA's reputation, and our side-by-side tests, produced in controlled conditions, were the best way of presenting some facts rather than just VCA's opinion. This is the same way that IPVM tests competing products - by setting them up side-by-side, recording the results, and then comparing the performance.

This is not a VCA marketing video – you won’t find it, or any others like it, on our web site. We only decided to publish because of the need to counter your unfair comment. In fact, the main purpose of these side-by-side tests is for our R&D team to make sure that our performance is up to scratch compared to our competitors. We have a dark room where all the cameras see the same scene at the same time, under the same conditions.

Re i-LIDS being a 'UK only' test - this is not really the case; any manufacturer can apply, and many non-UK companies have indeed done so. I accept that i-LIDS only has real traction in the UK market, where it is a mandatory requirement for government contracts. But my point stands: if a product is i-LIDS certified, it has been independently verified to have robust outdoor performance. In VCA's case it backs up the evidence in the video we posted.

Eric Fleming Bonilha

The VCA engine is a great, robust and proven engine; just because you didn't test it, you can't say otherwise...

"Guessing" is not something I would expect from you, John... To be fair, you should test it, put VCA side-by-side with other analytics engines such as VideoIQ, and then you can express a well-founded opinion... because guesses have no foundation at all.

John Honovich

Your test is inherently biased. You have an agenda for your product to come out on top. Do not compare it to our tests. Do not call it evidence.

i-Lids is run by the UK government. It is NOT a world standard. If you want to say that the UK government validates your analytics, feel free. That has nothing to do with the rest of the world nor does UK take any stance about your analytics vs VideoIQ.

As you know, we have been doing surveys, getting feedback from integrators for years, and we have not heard anyone say anything positive about your analytics. The only reason I hear of people using them is that yours are cheap, not better.

Undisclosed #2

First off, the test claims (at the end) that it was done by playing clips from a TFT monitor, which is not a very realistic test scenario.

There are also several false conclusions drawn in the video (for example, the misunderstanding of how/why VideoIQ would "lose track" of objects standing still). There is also no detail about how the systems were set up, calibrated/trained, etc.

Also, all the test cases seemed to focus only on scenes where you're trying to catch particular object movements; there are no examples of dealing with challenging conditions in the sense of suppressing alarms for typical false-alarm conditions.

The sample video is really in no way indicative of how the two systems would generally perform over time in real-world conditions.

Additionally, if you're comparing any two systems, you should also factor in performance over a given range or area. In all of these examples it *appeared* that the shots were relatively narrow. If you were evaluating system performance, you would want to compare how much distance or area each system could cover.

Rukmini Wilson

"...'side by side shootout' test video on YouTube which shows VideoIQ performing worse in demanding situations."

Geoff, why do you think anyone would believe that this is anything but a selective representation of clips, showing when VCA out-performed VIQ and omitting any where it did not? It's only 100 seconds of video from 5 clips total! Surely you recorded a little more, no?

"We have tested quite a few competitor products this way and so far none of them has out-performed VCA on these test clips."

I believe that they did out-perform VCA on other clips; otherwise we wouldn't be seeing only these.

But who would believe VCA beats VIQ every time? (Except for the occasional begrudging tie thrown in...) Not only that, VCA wins every time against all comers! Name any independent test of any product where the winner beats all competitors in all situations!

The test method ('no bubble', etc.) does not address this well-known issue of reporting bias. IMHO, it fails the credibility test as unbiased research. It does do much better as a marketing video, though; just add in a voting contest at the end and it's perfect.

John Honovich

"We have tested quite a few competitor products this way and so far none of them has out-performed VCA on these test clips."

By the way, this is another problem with using clips and pointing a camera at a monitor: the company doing this can optimize/tune their algorithms specifically for the clips. It's like having the questions to a test beforehand. It doesn't show much about real knowledge, but it does let you brag that you are smarter than others.
