I am playing with the InstantTracker and wanted to add some custom interaction for points in the point cloud list. The documentation lacks detailed information on this, so I hoped someone could help me out here. :)
However, since this cloud can consist of many points (including points currently outside the view), I was wondering if there are any properties of the point cloud that could help me optimize for better performance. I am specifically thinking of point ordering, etc., so I could limit the search scope and improve efficiency.
Also, as of today, these points are just feature points mapped into 3D space, right? Is there a way to get information such as the average color around a point (sampled from the camera texture)?
I believe this could be achieved by projecting the 3D point to a screen point (or onto the near clipping plane) and then sampling pixels from the camera texture. Are there any risks that the cached camera texture and the requested point cloud are out of sync?
Thanks for the help,
The points in the instant tracking point cloud are in no specific order.
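Since no ordering is guaranteed, one common way to limit the search scope is to project each point into screen space first and discard anything outside the viewport (or far from the tap position) before doing the distance comparison. This is not SDK code, just a minimal pure-Python sketch of that idea; the matrix layout, `tap_ndc` parameter, and threshold are all assumptions for illustration:

```python
import math

def project(point, mvp):
    """Project a 3D point with a 4x4 model-view-projection matrix
    (given as a list of rows). Returns normalized device coordinates
    (x, y), or None if the point is behind the camera."""
    x, y, z = point
    clip = [sum(mvp[r][c] * v for c, v in enumerate((x, y, z, 1.0)))
            for r in range(4)]
    if clip[3] <= 0.0:
        return None  # behind the camera, cannot be tapped
    return clip[0] / clip[3], clip[1] / clip[3]

def nearest_point_to_tap(points, mvp, tap_ndc, max_ndc_dist=0.5):
    """Brute-force nearest-point search, but only over points that
    project into the viewport and lie within max_ndc_dist of the
    tap position in screen space."""
    best, best_d = None, float("inf")
    for p in points:
        ndc = project(p, mvp)
        if ndc is None or abs(ndc[0]) > 1.0 or abs(ndc[1]) > 1.0:
            continue  # cull points outside the current view
        d = math.hypot(ndc[0] - tap_ndc[0], ndc[1] - tap_ndc[1])
        if d < max_ndc_dist and d < best_d:
            best, best_d = p, d
    return best
```

Because the cloud is rebuilt continuously, it is usually cheaper to redo this screen-space cull per tap than to maintain a spatial index across frames.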
You are right with your assumption that these points are mapped feature points. Currently we don't offer an API to get the average color around a point. Regarding synchronisation: for now it is not guaranteed that the currently rendered camera frame is the one the point cloud was generated from; the offset can be around one frame. Unfortunately there is no API to query this exact offset.
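Even without a dedicated API, the sampling itself is straightforward once a point has been projected to NDC. Here is a hedged pure-Python sketch (the image layout and all names are hypothetical, standing in for the cached camera texture) that averages a small pixel window around the projected point:

```python
def average_color_around(image, width, height, ndc, radius=2):
    """Average the RGB values in a (2*radius+1)^2 window around the
    pixel that a point's NDC coordinates map to. `image` is a row-major
    list of rows of (r, g, b) tuples, a stand-in for the camera texture.
    Caveat: the cached camera frame may lag the point cloud by ~1 frame,
    so the sampled colors are only approximately in sync."""
    # Map NDC (-1..1) to pixel coordinates; NDC y points up, pixel y down.
    px = int((ndc[0] * 0.5 + 0.5) * (width - 1))
    py = int((1.0 - (ndc[1] * 0.5 + 0.5)) * (height - 1))
    acc, n = [0, 0, 0], 0
    for yy in range(max(0, py - radius), min(height, py + radius + 1)):
        for xx in range(max(0, px - radius), min(width, px + radius + 1)):
            for i in range(3):
                acc[i] += image[yy][xx][i]
            n += 1
    return tuple(a / n for a in acc)
```

Averaging over a window rather than reading a single pixel also softens the effect of the one-frame offset, since small camera motion between frames mostly shifts the sampled region by a few pixels.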
Could you explain your use case in more detail so that we get an idea of how we could improve our API?