You'd need to work with indoor positioning.
Thx and greetings
Please make sure to use our Epson SDK if you test on the Epson glasses. The reason is that the Epson SDK has specific calibration mechanisms to position the augmentations properly in the glasses. The Xamarin SDK is not optimized for the glasses, and we don't recommend using it with them.
As for the navigation part: the SDK can consume location data from different sources (e.g. GPS, or an indoor location SDK for indoor positioning/navigation). So you could, for example, work with beacons plus an indoor positioning provider, and integrate that SDK into your app alongside ours.
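For illustration only, here is a minimal 2-D trilateration sketch showing how beacon range estimates (e.g. derived from BLE RSSI or UWB) could be turned into a position that an indoor positioning provider would then feed to the AR SDK as location data. The class name, beacon layout, and numbers are hypothetical; this is not part of the Wikitude or Epson SDKs, and a real deployment would rely on the indoor positioning vendor's own solver and filtering.

```java
// Hypothetical helper: 2-D trilateration from three beacons with known
// positions and measured ranges. Real indoor positioning stacks add
// filtering (e.g. Kalman) and handle noisy RSSI; this only shows the core math.
public class Trilateration {

    /** Returns {x, y} for the point at the given distances from three beacons. */
    public static double[] solve(double[] b1, double d1,
                                 double[] b2, double d2,
                                 double[] b3, double d3) {
        // Subtracting the first circle equation from the other two
        // yields a linear 2x2 system: A * [x, y]^T = c.
        double a11 = 2 * (b2[0] - b1[0]), a12 = 2 * (b2[1] - b1[1]);
        double a21 = 2 * (b3[0] - b1[0]), a22 = 2 * (b3[1] - b1[1]);
        double c1 = (sq(b2[0]) + sq(b2[1]) - sq(d2))
                  - (sq(b1[0]) + sq(b1[1]) - sq(d1));
        double c2 = (sq(b3[0]) + sq(b3[1]) - sq(d3))
                  - (sq(b1[0]) + sq(b1[1]) - sq(d1));
        double det = a11 * a22 - a12 * a21; // zero if the beacons are collinear
        return new double[] { (c1 * a22 - c2 * a12) / det,
                              (a11 * c2 - a21 * c1) / det };
    }

    private static double sq(double v) { return v * v; }

    public static void main(String[] args) {
        // Beacons at (0,0), (10,0), (0,10); true operator position (3,4).
        double[] p = solve(new double[]{0, 0}, 5.0,
                           new double[]{10, 0}, Math.sqrt(65),
                           new double[]{0, 10}, Math.sqrt(45));
        System.out.printf("x=%.2f y=%.2f%n", p[0], p[1]); // x=3.00 y=4.00
    }
}
```

The resulting coordinates would then be mapped into whatever location format your AR SDK integration expects and pushed to it whenever the indoor positioning provider reports an update.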
Thx and greetings
Do you think Wikitude SLAM would be enough for this use case, i.e. without any indoor location solution?
In my company we are working with the BT-300. Epson recommended your SDK among others, but we are still not sure whether we can meet the main requirement using your SDK together with the BT-300 device.
I have the requirement to create a logistics/warehousing application that lets operators move around the warehouse, moving packages from one place to another. The application should guide the operator using AR, showing arrows and drawing spots where packages should be picked up or placed.
To be clearer, here is a video that shows what I mean:
https://www.youtube.com/watch?v=ZWsBHISOqjA (from 00:35 to 00:45).
We ran your samples on different platforms: Xamarin/Xamarin.Forms, Unity, and native with Android Studio. For some reason, all of them (except the Android Studio project) showed the camera feed on screen, with all the objects rendered on top of it. Since the BT-300 is a pair of glasses, not a smartphone, I would expect the objects to be rendered without the camera feed.
I want to know whether we actually need an indoor location solution such as beacons, Wi-Fi fingerprinting, or UWB to achieve this goal, or whether we can get it done without one. I would really appreciate it if you could point us in the right direction.
Thanks for your support.