This leaves me with a file for testing in my own activity.
About loading: as I said, I tried both approaches, and I don't see anything. My logcat shows no errors about loading or anything else, but I also don't see any lines, squares, or other demo augmentations.
The example "1.3 Tracking 3D" deletes the recorded tracking map when you quit the sample. Please use the "Tracking Map Recorder" in the upper-right corner of the Native SDK Example App to record a map and save it.
To load a map in the same app in which you recorded it, load it as you did here:
File file = new File(mWikitudeSdk.getTrackingMapRecorder().getTrackingMapStorageLocation(), "testcv.wtm");
ClientTracker tracker = mWikitudeSdk.getTrackerManager().create3dClientTracker(file.getAbsolutePath());
To load a map that you recorded in one app, exported, and packaged as an asset in another app, you need to load it from the asset location instead of the recorder's storage location.
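Since an Android asset has no absolute file path, one common approach is to copy the bundled .wtm asset into the app's files directory first and then hand that file's absolute path to create3dClientTracker, as in the snippet above. A minimal sketch, assuming the asset is named "testcv.wtm" (the copy helper is plain Java; the commented Android usage at the bottom is an assumption and requires a Context):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class TrackingMapLoader {

    // Copies a stream (e.g. an opened asset) to a real file on disk so the
    // SDK can load it via an absolute path.
    static File copyToFile(InputStream in, File outFile) throws IOException {
        try (OutputStream out = new FileOutputStream(outFile)) {
            byte[] buf = new byte[8192];
            int len;
            while ((len = in.read(buf)) != -1) {
                out.write(buf, 0, len);
            }
        }
        return outFile;
    }

    // Hypothetical Android usage (inside an Activity, so 'this' is a Context):
    //
    // File mapFile = TrackingMapLoader.copyToFile(
    //         getAssets().open("testcv.wtm"),
    //         new File(getFilesDir(), "testcv.wtm"));
    // ClientTracker tracker = mWikitudeSdk.getTrackerManager()
    //         .create3dClientTracker(mapFile.getAbsolutePath());
}
```

The copy only needs to happen once (or when the asset changes); afterwards the map can be loaded straight from the files directory.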
I noticed there are examples for client recognition, rendering, etc., but they all use 2D recognition. The only examples for 3D recognition are recording and recording + recognition. I'm having trouble creating my own activity that only does recognition.
I am looking for an example and documentation of something like the ExternalRendering example, but for point cloud recognition.