
Separate tracking from rendering

Hi,

My first question (https://support.wikitude.com/support/discussions/topics/5000081332) was perhaps not clear, so here is a better one:


I would like to use Wikitude for tracking, but I would like to keep my own rendering engine.

Based on the SimpleClientTracking example I saw how I can do my own external rendering for the 3D model, but I would like to render the background picture too.


It seems the RenderExtension does this job by itself in the GLRenderer, but is it possible for me to drive the background rendering (the camera picture) myself, and if so, how? I don't need to retrieve the picture data; being able to ask for the background picture's texture binding would be enough.

I really need to drive the render operations myself so I can order them and make them fit my custom rendering engine.
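For reference, here is roughly how the RenderExtension is hooked into a custom renderer in my current test, following the SimpleClientTracking sample. This is only a simplified sketch: my real engine classes are different, and the exact package of RenderExtension and the forwarding of the surface callbacks are assumptions based on my SDK version.

```java
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.opengl.GLSurfaceView;

import com.wikitude.common.rendering.RenderExtension; // package may differ per SDK version

// Custom renderer that forwards the GL callbacks to the Wikitude RenderExtension,
// as in the SimpleClientTracking external-rendering sample.
public class CustomRenderer implements GLSurfaceView.Renderer {

    private final RenderExtension mRenderExtension; // received via onRenderExtensionCreated()

    public CustomRenderer(RenderExtension renderExtension) {
        mRenderExtension = renderExtension;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // Forwarded as in the sample (assuming the RenderExtension exposes this callback).
        mRenderExtension.onSurfaceCreated(gl, config);
        // ... create my own GL resources here ...
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        mRenderExtension.onSurfaceChanged(gl, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // The SDK updates tracking and draws the camera background here;
        // this is exactly the step I would like to control myself.
        mRenderExtension.onDrawFrame(gl);

        // ... then I draw my own 3D content with my engine ...
    }
}
```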


Thanks for any help you can provide.


Regards, 

    Vincent


Hi,

It helps, but I still have some doubts:


According to your summary, I can choose whether or not to use the RenderExtension? I understood from our previous exchanges that I at least need to manage the RenderExtension instance and call onDrawFrame() on it to have the AR working.


Here is what I do:

I create a new texture in my rendering engine and set the Wikitude camera manager to use the same texture ID. All I expect from the AR engine is to bind the camera picture to that texture; the rendering engine handles its creation (in RGB888), the rendering, and so on.
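In code, this currently looks roughly like the fragment below. It is simplified: width, height and mWikitudeSDK are fields from the rest of my code, the RGB888 allocation is entirely on my side, and I am assuming that the GL texture name is what setTextureId() expects, which is part of what I'm asking about.

```java
import android.opengl.GLES20;

// Simplified fragment from my engine's GL setup (runs on the GL thread).
private int createCameraTexture(int width, int height) {
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);
    int cameraTextureId = textures[0];

    // My rendering engine owns the texture and allocates it as RGB888.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, cameraTextureId);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB, width, height, 0,
            GLES20.GL_RGB, GLES20.GL_UNSIGNED_BYTE, null);

    // The only Wikitude call: hand the texture ID to the camera manager and
    // expect the SDK to bind/fill the camera picture into it.
    mWikitudeSDK.getCameraManager().setTextureId(cameraTextureId);

    return cameraTextureId;
}
```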


So, let's start from your first case: I provide the RGB texture and I render it. What do I need to do in my Wikitude AR library to get this offscreen rendering (i.e. the texture binding)?


The texture I saw at index 0 is probably the Android camera capture in luminance format that you use internally.



Last point:

To set the texture in the Wikitude OpenGL context, I have

startupConfiguration.setTextureId()


and


mWikitudeSDK.getCameraManager().setTextureId()

 

Is there any difference?



Thanks for your help.

Regards, 

    Vincent
