For our project, we want to place content by recognising an object trackable with Wikitude. Once recognition has happened, we want to hand tracking off to AR Foundation so we can look around and view content placed around the scanned object, without having to keep the object in view.
We followed the instructions provided by Gökhan here: https://support.wikitude.com/support/discussions/topics/5000095172
As far as we understand, this is how "extended tracking" should be handled now.
Our issue is as follows: we can scan the object and place content correctly. However, as soon as we move the camera so that the object is no longer visible (Wikitude's OnObjectLost is called), the content behaves strangely: it is no longer tracked in the environment but moves with the camera.
It appears that the content is not properly placed in AR Foundation space, or at least not at the correct position, as it seems a lot farther away.
I've attached a video in which I place a cube after scanning part of the desk (object tracking). I then click a button that calls the "OnObjectLost" method to simulate losing tracking, and the cube is no longer anchored to the AR Foundation planes, as if it were placed several metres farther away.
Thank you for your help.
What are you trying to track? It seems that the scale of the real image/object does not match the scale defined in the target collection, and therefore the augmentation is placed at a distance...
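To see why a wrong target scale shows up as content "several metres farther away": a monocular tracker can only recover distance from the object's apparent size in the image, so the estimated distance scales linearly with the physical size assumed in the target collection. A minimal pinhole-camera sketch (all numbers here are made-up illustration values, not from the actual project):

```python
# Pinhole model: apparent_size_px = focal_px * real_size_m / distance_m
# The tracker solves for distance using the size it *assumes* the object has:
#   distance_m = focal_px * assumed_size_m / apparent_size_px

focal_px = 800.0        # hypothetical camera focal length in pixels
apparent_px = 200.0     # hypothetical apparent size of the object in the image
true_size_m = 0.2       # the object's real physical size (hypothetical)
assumed_size_m = 1.0    # wrong size stored in the target collection (hypothetical)

true_distance = focal_px * true_size_m / apparent_px        # 0.8 m
estimated_distance = focal_px * assumed_size_m / apparent_px  # 4.0 m

# The error factor is exactly the scale mismatch:
error_factor = estimated_distance / true_distance  # 5.0
```

So a target collection that overstates the object's size by 5x places the augmentation 5x farther away, which matches the symptom of the content drifting with the camera once object tracking hands off to a world-anchored pose.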
That was exactly it... it seems I forgot to adjust the scale of the object target, and the objects were placed at a distance. I have updated my desk (test) scan and it works perfectly now. It should also work correctly for our real-world case (scanning the interior of a vehicle).
Thank you so much!