Hi, I would like to add AR capabilities to my WebRTC-based app. We managed to show frames but are having some trouble with InstantTracker. The symptom is: I can see the local camera NV21 frames routed from WebRTC to Wikitude via FrameInputPlugin, but the yellow square that InstantTracker uses to set the pose is never shown, and the InstantTracker state-change events are never delivered to my InstantTrackerListener. It turns out the GL thread never starts, so the GL context is never created. Unfortunately I'm not an OpenGL expert and I'm still trying to find out why this happens. In my app the Wikitude machinery is started not in the activity's onCreate, but on an event we receive over the network. I'm attaching both a code snippet and the Android console log; I'd appreciate any hint. Thanks for your help, Alessandro
The OpenGL problem turned out to be a trivial one: a call to layout.addView had been removed by mistake. Sorry... However, even with the layout issue fixed, the InstantTracker callbacks are still not called.
Alessandro Zemella