Hello Wikitude Team!
We are working on an internal project for Bureau Veritas France.
Our goal is to build an Android application for placing sticky notes in an industrial environment.
Our main constraint is that we need to work in an asynchronous mode, so we must be able to save the scene on the device and load it later, either on the same device or on another one.
We started development two weeks ago using your Android Native API. After building the structure of the app, we are encountering our first issues and difficulties:
- We have to draw our tags/notes in native OpenGL, which is not very friendly for us.
- We chose to use Instant Tracking, so we have to go through two states, Initialization and Tracking, which requires user interaction. For us this is not very efficient.
- We based our development on the Wikitude Instant Tracking examples. Despite this, the outcome is not what we expected: tag localization is not accurate, and the scene disappears as soon as we move even slightly.
- Moreover, when we add a marker on a wall, it appears very small (perhaps because it is interpreted as being far from the camera). Plane detection is enabled, but it does not seem very accurate.
The good point is that when we save a scene (locally, in the device's internal storage) and load it later, the anchors (tags) appear at nearly the same place.
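For the save/load step described above, the note content and positions have to be persisted alongside whatever target file the SDK produces. Here is a minimal sketch of how we currently store them, assuming each note is reduced to a position in the scene's coordinate space plus a text label; the `NoteStore` class and its file format are our own illustration, not part of the Wikitude API:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper for persisting sticky-note anchors next to the
// saved scene data. Not part of the Wikitude API.
public class NoteStore {
    public static class Note {
        public final float x, y, z; // note position in the scene's coordinate space
        public final String text;   // note content
        public Note(float x, float y, float z, String text) {
            this.x = x; this.y = y; this.z = z; this.text = text;
        }
    }

    // Writes the notes to a simple binary file.
    public static void save(File file, List<Note> notes) throws IOException {
        try (DataOutputStream out = new DataOutputStream(
                new BufferedOutputStream(new FileOutputStream(file)))) {
            out.writeInt(notes.size());
            for (Note n : notes) {
                out.writeFloat(n.x);
                out.writeFloat(n.y);
                out.writeFloat(n.z);
                byte[] bytes = n.text.getBytes(StandardCharsets.UTF_8);
                out.writeInt(bytes.length);
                out.write(bytes);
            }
        }
    }

    // Reads the notes back; called after the matching scene is loaded.
    public static List<Note> load(File file) throws IOException {
        try (DataInputStream in = new DataInputStream(
                new BufferedInputStream(new FileInputStream(file)))) {
            int count = in.readInt();
            List<Note> notes = new ArrayList<>(count);
            for (int i = 0; i < count; i++) {
                float x = in.readFloat();
                float y = in.readFloat();
                float z = in.readFloat();
                byte[] bytes = new byte[in.readInt()];
                in.readFully(bytes);
                notes.add(new Note(x, y, z,
                        new String(bytes, StandardCharsets.UTF_8)));
            }
            return notes;
        }
    }
}
```

This round-trips correctly on the same device; our open question is whether the anchor coordinates stay meaningful when the scene is reloaded on another device.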
Can you give us some advice? We are under a lot of pressure on this project, so we need good results. Rest assured that if we achieve good results, we will take licences for several years.
To illustrate what we are trying to achieve, here is a video that I found on YouTube:
Here is our current result (sorry for the long video):
Can you confirm that Instant Tracking is the right choice for our project?
Any advice will be welcome :)
Thanks a lot
Good morning Julien,