
How are duplicate image targets handled in the rest api?

Does the rest api check uploaded target images for duplicates, and not insert them in a target collection if they already exist? There will inevitably be some duplicates when dealing with feeds from publishers. If a new image target is not added, and we don't know that, it will throw off the posting/expiring cycle, and not implement updated metadata. We understand that you don't want duplicates in target collections. But is it possible to disable the old version instead? Thanks.

Hi Christopher,

The API only checks for identical images, meaning the exact same target files. A "DUPLICATE_TARGETIMAGE_USAGE" error (HTTP status 400; the documentation will be updated with the next deployment) is thrown on target creation.
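To make that concrete, here is a minimal sketch of how a client could react to that error when creating a target. The response-body shape (a JSON object with a `code` field) is an assumption for illustration, not the documented payload:

```python
# Sketch: interpreting a target-creation response from the REST API.
# The body structure ({"code": ...}) is an assumed shape, not the official one.

def handle_create_response(status_code, body):
    """Map a target-creation HTTP response to a client-side outcome."""
    if status_code == 400 and body.get("code") == "DUPLICATE_TARGETIMAGE_USAGE":
        # The exact same image file already exists in the collection.
        return "duplicate"
    if 200 <= status_code < 300:
        return "created"
    return "error"

print(handle_create_response(400, {"code": "DUPLICATE_TARGETIMAGE_USAGE"}))  # prints "duplicate"
```

A client can then branch on "duplicate" to, for example, update the existing target's metadata instead of silently losing the upload.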

From your description, the "Similarity Check" endpoint may help here. Note, though, that it is a crucial check. Usually, an image with a similarity score of 0.1 or below is safe to use, even in a target collection of 10k+ targets.
Scores larger than 0.4 indicate that some areas of the images may get mixed up with at least one existing image (e.g. the top part of your image looks similar to the bottom part of an existing one = at least in terms of extracted features).
If you plan to scale up and e.g. let users augment their own targets, where mixing up augmentations is critical, I recommend rejecting images with a similarity score of 0.2 or higher. If you manage content on your own and there is no privacy issue should you augment the wrong target, images below a score of 0.4 are fine to use.
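The thresholds above can be summarized in a small helper. This is an illustrative sketch of the decision rule only; the score itself would come from the Similarity Check endpoint, and the function name is hypothetical:

```python
# Sketch of the similarity-score thresholds described above.
# 0.1 or below: safe even in large collections.
# User-generated content: reject at 0.2 or higher.
# Self-managed content: scores below 0.4 are fine.

def classify_similarity(score, user_generated_content=False):
    """Map a similarity score (0.0-1.0) to a recommended action."""
    threshold = 0.2 if user_generated_content else 0.4
    if score <= 0.1:
        return "safe"
    if score < threshold:
        return "acceptable"
    return "reject"  # risk of mixing up augmentations
```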
At this point, I have to highlight that the similarity score comes without any guarantee. Especially in terms of privacy and protection of sensitive content, I highly recommend finding other ways of separating and protecting user content, e.g. authenticate the user, bind users to isolated target collections, and protect augmentation downloads on a session level.

Disabling targets is not possible; you have to delete them and trigger cloud archive generation whenever you modify the collection via the API.
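Sketched as request URLs, the delete-then-regenerate flow looks roughly like the following. The base URL and path segments are assumptions shaped after typical REST conventions and should be checked against the current API documentation:

```python
# Sketch: replacing (rather than disabling) a target means deleting it and
# then regenerating the cloud archive. Paths below are assumed, not official.

BASE = "https://api.example.com/cloudrecognition"  # placeholder host

def delete_target_url(tc_id, target_id):
    """DELETE this URL to remove a single target from a collection."""
    return f"{BASE}/targetCollection/{tc_id}/target/{target_id}"

def generate_archive_url(tc_id):
    """POST to this URL to trigger cloud archive (re)generation."""
    return f"{BASE}/targetCollection/{tc_id}/generation/cloudarchive"
```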

Best regards,
Andreas Fötschl

If I understand this correctly, you have just made your product unacceptable to publishers. We need to be able to *reliably* post new content and delete old content upon its scheduled expiration date. New content may have very subtle visual changes, such as small text in an ad that changes the terms of an offer. There may be monthly coupons that are visually identical except for a new expiration date or tracking code -- but the metadata used in the augmentation may change. I strongly suggest you make this "duplicate" check go into effect only upon an active change in account configuration. We will not be proceeding with any further integration work with your system if we cannot reliably upload new content. Thank you.

@Wikitude: if a developer uploads duplicate images, can you please allow them to choose what happens? 

  • Replace the old image(s) with the new image(s).
  • Keep the old image(s) and cancel adding the new image(s) to the cloud database.


The option we'd like is not to delete duplicates at all, so our content database matches the Wikitude target collection. (We avoid duplicates as much as possible, but there will inevitably be some.) If there are duplicate/near-duplicate image targets from a scan, return the most recent of them.

This is a good example for the variety of possible use cases of cloud recognition.

While others have to strictly avoid any mismatch between similar targets, your ideal cloud recognition mechanism should be lenient and return the most recent target when others look similar.

However, there is an easy way to map your requirements to the existing Wikitude CloudReco solution.

Please have a look at the metadata attribute, which you can optionally set per target (compare the related SDK sample). Assuming that you frequently update your vouchers, you may apply the following workflow for the "time relevant" ones (T):

  1. Upload all voucher images that look somehow similar to voucher T, e.g. different language, seasonal customizations, ... (T1, T2)
  2. Whenever there is a new "release" of a voucher, you update the metadata of all relevant vouchers (T1, T2)
  3. Within your application you parse the metadata field to trigger the right action after a successful scan. You basically rely on the meta-data of Tx and ignore the "target-name".
  4. You may define a naming convention for all possible T targets (e.g. a special prefix), so you can search for all Tx targets by querying T via a search term (getTargets parameter s).
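The steps above can be sketched in a few lines. The field names, the "T" prefix convention, and the metadata contents are illustrative assumptions; only the principle (act on metadata, ignore the target name) comes from the workflow:

```python
# Sketch of the metadata-driven workflow: voucher targets share a name prefix,
# and the app acts on the metadata field rather than the target name.

TARGETS = [
    {"name": "T1_voucher_en", "metadata": {"campaign": "march", "discount": 20}},
    {"name": "T2_voucher_de", "metadata": {"campaign": "march", "discount": 20}},
    {"name": "poster_01",     "metadata": {}},
]

def find_voucher_targets(targets, prefix="T"):
    """Step 4: select all Tx targets via the naming convention."""
    return [t for t in targets if t["name"].startswith(prefix)]

def on_scan(target):
    """Step 3: ignore the target name; trigger the action from metadata."""
    meta = target["metadata"]
    if not meta:
        return "no action"
    return f"show {meta['discount']}% offer ({meta['campaign']})"
```

When a new voucher "release" goes out (step 2), only the metadata of T1 and T2 needs updating; the target images stay in place.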

Customers have already used this mechanism to properly augment magazines that ship in different languages.

Hope this helps,

Best regards,
Andreas Fötschl

No. This does not even remotely match how our clients (daily newspapers) manage content. We will be terminating work integrating with Wikitude if you do not provide a viable alternative to this unacceptable policy of deleting newly uploaded content if your system decides there is an existing duplicate.

Hi Ryan,

As written in my previous post, there is no need to delete potentially duplicate assets; it's about defining the right mechanism to map your requirements onto the existing generic cloud recognition.

You have to arrange content and meta-data properly to take the right actions once a new voucher-action is released.

Your app needs to use cloud-recognition to find out which voucher-type (out of a set of similar vouchers) is scanned.

I hope you find a way to implement the desired app behaviour. Please understand that we cannot provide built-in features for every use case, but instead try to give customers advice during their concept and integration work via the forum and a ticketing system.

Best regards,
