Instant Tracking is an algorithm that, unlike those previously introduced in the Wikitude SDK, does not aim to recognize a predefined target and then start tracking; instead, it starts tracking immediately in an arbitrary environment.
The quality of the tracking depends on the surroundings being scanned and also on the device capabilities (e.g. if you're working with SMART. SMART is a seamless API that integrates ARKit, ARCore and Wikitude's SLAM in a single, cross-platform augmented reality SDK for any device. It ensures the delivery of the best possible augmented reality experience on a wider range of devices, covering 92.6% of iOS devices and about 35% of Android devices available on the market.)
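If you want to decide at runtime whether SMART can fall back on platform tracking (ARKit/ARCore) on the current device, the Wikitude JavaScript API offers a support check for this. The sketch below is a minimal illustration; the stubbed `AR` object stands in for the real SDK namespace (which is provided globally inside a Wikitude world) so the snippet runs standalone, and the callback result here is hard-coded.

```javascript
// Stub of the Wikitude `AR` namespace so this sketch runs outside a
// Wikitude world; the real SDK provides `AR` globally.
const AR = {
  hardware: {
    smart: {
      // On a real device this reports whether ARKit/ARCore can back SMART;
      // the stub always answers false.
      isPlatformAssistedTrackingSupported: (callback) => callback(false)
    }
  },
  InstantTracker: function (options) {
    this.smartEnabled = options.smartEnabled;
  }
};

// Query the device first, then create the tracker accordingly: SMART uses
// platform-assisted tracking where available and Wikitude's SLAM otherwise.
AR.hardware.smart.isPlatformAssistedTrackingSupported(function (isSupported) {
  const tracker = new AR.InstantTracker({ smartEnabled: isSupported });
  console.log("platform-assisted tracking supported:", isSupported);
});
```

Check the exact names against the API reference of the SDK version you use; the pattern (query support, then configure the tracker) is the point here.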
Extended Tracking is used when you have an image or object for initialization and then want to move away from that image or object. The tracking extends beyond the limits of the original target image or object.
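In the Wikitude JavaScript API, extended tracking is enabled per trackable. The sketch below follows the naming of older SDK versions (`AR.Trackable2DObject` with `enableExtendedTracking`); treat these identifiers as assumptions and verify them against the docs for your SDK version. A stubbed `AR` namespace makes the snippet runnable standalone.

```javascript
// Stub of the Wikitude `AR` namespace so this sketch runs standalone;
// the identifier names follow older Wikitude JS API docs and are
// assumptions to verify against your SDK version.
const AR = {
  ImageTracker: function (resource) { this.resource = resource; },
  Trackable2DObject: function (tracker, targetName, options) {
    this.tracker = tracker;
    this.targetName = targetName;
    this.enableExtendedTracking = !!options.enableExtendedTracking;
    this.onExtendedTrackingQualityChanged = options.onExtendedTrackingQualityChanged;
  }
};

// Enable extended tracking so the augmentation stays anchored after the
// camera moves away from the original image target.
const tracker = new AR.ImageTracker("target-collection.wtc"); // hypothetical path
const trackable = new AR.Trackable2DObject(tracker, "myTarget", {
  enableExtendedTracking: true,
  // Quality typically degrades as the camera moves further from the target.
  onExtendedTrackingQualityChanged: function (target, oldQuality, newQuality) {
    console.log(target + ": extended tracking quality " +
      oldQuality + " -> " + newQuality);
  }
});
```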
If you have further questions or issues with the tracking quality, sending a video of the surroundings would help us give further details on how suitable the scene is.
Thx and greetings
Thank you for the information, I guess I was using the wrong terminology.
What I would like to do is use a loaded instant target to position a 3D object (persisted from a saved target) and then use instant tracking to track from there.
e.g. place a 3D object using the instant target then walk around it
What I have found is that if I move the camera away from the saved target, tracking is lost completely (the floor grid just moves along with the camera).
For example, if I walk around the 3D object tracking is lost, and if I pan the camera left or right tracking is also lost.
Is this expected behaviour, or am I missing something? I read that SMART and instant tracking targets cannot be used together, which would seem to be the answer.
Thx for clarifying this and your use case. You're correct: if you work with persistent instant targets that you save and load, you'll need to work without SMART (persistent storage is only available with Wikitude's own instant tracking).
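In practice this means creating the instant tracker with SMART explicitly disabled before loading a saved target. The sketch below is a minimal illustration, assuming `smartEnabled` and `loadExistingInstantTarget` as named in the Wikitude JS API (verify against your SDK version); the stubbed `AR` namespace and the target path are hypothetical so the snippet runs standalone.

```javascript
// Stub of the Wikitude `AR` namespace so this sketch runs standalone;
// in a real Wikitude world the SDK provides these objects.
const AR = {
  InstantTracker: function (options) { this.smartEnabled = options.smartEnabled; },
  TargetCollectionResource: function (path) { this.path = path; }
};
AR.InstantTracker.prototype.loadExistingInstantTarget =
  function (resource, onSuccess, onError) { onSuccess(); }; // stub: always succeeds

// Persistent instant targets require Wikitude's own SLAM,
// so SMART must be disabled on the tracker.
const tracker = new AR.InstantTracker({ smartEnabled: false });

// Hypothetical path to a previously saved instant target.
const saved = new AR.TargetCollectionResource("saved-instant-target");
tracker.loadExistingInstantTarget(saved, function () {
  console.log("instant target loaded; tracking can resume from the saved map");
}, function (error) {
  console.log("loading failed: " + error);
});
```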
Could you send me details on your scene, and a video showing the experience when you scan, to forum [at] wikitude.com? Then I can provide further details on the suitability and quality of the scene.
Thx and greetings
The scenes are basically the sample scenes.
The only thing I added is an object that appears at the loaded target's position.
Here is a video of the tracking issues:
In the save scene, tracking is lost as I try to turn through 360 degrees.
In the load scene, tracking is lost and the objects float around as I try to turn or approach the object.
Is this expected behaviour? Is the instant target designed for viewing from just one angle, even after it is loaded?
The desired behaviour is to place an object using the loaded instant target and then use the Wikitude SLAM to walk around the object.
Is this possible?
Is there a reason extended tracking is not available on loaded instant targets?
I am referring specifically to the save/load instant target demo
After moving the camera away from the scene (OnSceneLost), tracking just stops, which is pretty bad for UX.
I was hoping to use OnSceneRecognized to place a virtual object and then rely on normal camera tracking to keep the object in place while the user looks around.
At the moment, the object simply stops tracking on OnSceneLost.
Is there a workaround for this?