
Separate tracking from rendering

Hi,

My first question (https://support.wikitude.com/support/discussions/topics/5000081332) was perhaps not clear, so here is a better one:


I would like to use Wikitude for tracking while keeping my own rendering engine.

Based on the SimpleClientTracking example, I saw how to do my own external rendering of the 3D model, but I would like to render the background picture too.


It seems the RenderExtension does the job by itself in the GLRenderer, but is it possible, and how, for me to drive the background rendering (the camera picture) myself? I don't need to get the picture itself; just being able to ask for the background picture's texture binding would be enough.

I really need to drive the render operations myself so I can order them and make them match my custom rendering engine.


Thanks for any help you can provide.


Regards, 

    Vincent


Hi Vincent,

I responded via email - more information asap.


Best regards,

Andreas

Hi Andreas, 

I found something about renderExtension.onUpdate() and onDrawFrame(), and I did the same as described here: http://www.wikitude.com/external/doc/documentation/latest/androidnative/renderingnative.html#external_rendering_api

It is still not rendering the background picture from the camera... I don't really understand where the problem is...
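For reference, here is how I wired it, following that page (a sketch; everything outside the documented RenderExtension calls is my own naming):

```java
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.opengl.GLSurfaceView;

import com.wikitude.common.rendering.RenderExtension;

// Minimal external-rendering driver: forward the GL lifecycle to the
// RenderExtension received in onRenderExtensionCreated().
public class ExternalDriver implements GLSurfaceView.Renderer {

    private RenderExtension wikitudeRenderExtension;

    public void setRenderExtension(RenderExtension renderExtension) {
        wikitudeRenderExtension = renderExtension;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        if (wikitudeRenderExtension != null) {
            wikitudeRenderExtension.onSurfaceCreated(gl, config);
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        if (wikitudeRenderExtension != null) {
            wikitudeRenderExtension.onSurfaceChanged(gl, width, height);
        }
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        if (wikitudeRenderExtension != null) {
            // Updates tracking and the camera frame; my own draw calls follow.
            wikitudeRenderExtension.onDrawFrame(gl);
        }
    }
}
```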


Thanks

Regards, 

    Vincent

Hi,

I gave you access to the repository; don't hesitate to ask if you have any questions.


I did not find any documentation about separating rendering from tracking in the latest version. Did you write anything about it?


Thanks.

Regards, 

    Vincent

Hi Andreas,

I just sent an email.


I downloaded the latest release, and I'm upgrading the project to SDK 2.1.0 so I can test the latest features (separating rendering from tracking).

Thanks.


Regards, 

     Vincent

Hi Vincent,

We released an update of our Native SDK just yesterday. This update contains the separation of logic and render updates in the RenderExtension class. Maybe this already helps you.

Anyhow, I would be interested to see how you use the Wikitude SDK in your application, so if you could give me read access, that would be nice, thx. Simply write to android-sdk@wikitude.com and refer to this thread. It will then be forwarded to me and I'll get back to you asap with more details.


THX,

Andreas

Hi, 

I created a Bitbucket repository with a test project containing the minimal test code. I would prefer not to make it public; could you send me one or more email addresses so I can grant read access to the repository?

That way we will have a common code base to work from.


Thanks.

Regards, 

    Vincent

Hi Vincent,

I would really like to find out why it's not working for you. 


I'm looking forward to your questions/results ;)


Best regards,

Andreas

Hi  Andreas,


Yes, setting the texture id to "7" is for debug/test usage only. I'll also call setTextureId with the same index, to be sure.


It would be difficult for me to create a small project, since our rendering engine is not designed to work that way, but I'll think about it; it could save us a lot of time.


Thanks again, I'll be back here with more questions or maybe some good results ASAP!


Regards,

    Vincent

Hi Vincent,

The usage of the render extension seems fine.

In the snippet you sent, you set the texture id to `7`. Is this just for debugging/the snippet, or do you also use this in your production code?


Could you try to use CameraManager.setTextureId()? You can call this once the Wikitude SDK is started.


To speed things up, would it be possible for you to send us a minimal demo project so that we can have a look at it?


Best regards,

Andreas

Hi Andreas, 


I think I call the renderExtension methods as requested.

Here is an extract:


 

package com.innersense.osmose.ar.wikitudeandroid;

import android.app.Activity;
import android.content.Context;
import android.util.Log;

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.math.Matrix3;
import com.badlogic.gdx.math.Matrix4;
import com.badlogic.gdx.math.Quaternion;
import com.badlogic.gdx.math.Vector3;
import com.wikitude.NativeStartupConfiguration;
import com.wikitude.WikitudeSDK;
import com.innersense.osmose.ar.wikitude.ARWikitudeEngine;
import com.wikitude.common.camera.CameraSettings;
import com.wikitude.common.camera.CameraSettings.CameraPosition;
import com.wikitude.common.camera.CameraSettings.CameraResolution;
import com.wikitude.common.rendering.RenderExtension;
import com.wikitude.rendering.ExternalRendering;
import com.wikitude.tracker.ImageTarget;
import com.wikitude.tracker.ImageTracker;
import com.wikitude.tracker.ImageTrackerListener;
import com.wikitude.tracker.TargetCollectionResource;
import com.wikitude.tracker.TargetCollectionResourceLoadingCallback;

/**
 * Created by Vincent on 31/01/2017.
 */

public class ARWikitudeEngineAndroid extends ARWikitudeEngine implements ImageTrackerListener, ExternalRendering {

    private static final String TAG = "ARWikitudeEngineAndroid";


    private WikitudeSDK mWikitudeSDK;
    /**
     * Android activity instance.
     */
    private Activity activity;
    /**
     * The Wikitude license key to use.
     */
    private final String wikitudeLicenseKey;
    private TargetCollectionResource mTargetCollectionResource;

    RenderExtension renderExtension;

    public ARWikitudeEngineAndroid(String datasetFileName, boolean isAbsolute, String licenseKey) {
        super(datasetFileName, isAbsolute);
        this.wikitudeLicenseKey = licenseKey;
    }


    @Override
    public void onARStart() {
        super.onARStart();

        mWikitudeSDK = new WikitudeSDK(this);
        NativeStartupConfiguration startupConfiguration = new NativeStartupConfiguration();
        startupConfiguration.setLicenseKey(wikitudeLicenseKey);
        startupConfiguration.setCameraPosition(CameraPosition.BACK);
        startupConfiguration.setCameraResolution(CameraResolution.AUTO);
        startupConfiguration.setTextureId(7); //TODO: set at runtime?

        mWikitudeSDK.onCreate(activity.getApplicationContext(), activity, startupConfiguration);

        mTargetCollectionResource = mWikitudeSDK.getTrackerManager().createTargetCollectionResource(datasetFileName, new TargetCollectionResourceLoadingCallback() {
            @Override
            public void onError(int errorCode, String errorMessage) {
                Log.e(TAG, "Error : Failed to load target collection resource. Reason: " + errorMessage);
            }

            @Override
            public void onFinish() {
                mWikitudeSDK.getTrackerManager().createImageTracker(mTargetCollectionResource, ARWikitudeEngineAndroid.this, null);
            }
        });
    }

    @Override
    public void onARResume() {
        super.onARResume();
        mWikitudeSDK.onResume();
        renderExtension.onResume();
    }

    @Override
    public void onARDestroy() {
        super.onARDestroy();
        mWikitudeSDK.clearCache();
        mWikitudeSDK.onDestroy();
    }

    @Override
    public void onARPause() {
        super.onARPause();
        mWikitudeSDK.onPause();
        renderExtension.onPause();
    }

    @Override
    public void onARConfigurationChanged() {
        super.onARConfigurationChanged();
    }

    @Override
    public void onRenderExtensionCreated(final RenderExtension renderExtension) {
	
        //set texture ID
        mWikitudeSDK.getCameraManager().setTextureId(this.renderingEngine.getImageIndex());

        this.renderExtension = renderExtension;
        this.renderExtension.onSurfaceCreated(null, null);
    }

    @Override
    public void onTargetsLoaded(ImageTracker imageTracker) {
    }

    @Override
    public void onErrorLoadingTargets(ImageTracker tracker, int errorCode, final String errorMessage) {
    }

    @Override
    public void onImageRecognized(ImageTracker tracker, final String targetName) {
    }

    @Override
    public void onImageTracked(ImageTracker imageTracker, ImageTarget imageTarget) {
    }

    @Override
    public void onImageLost(ImageTracker imageTracker, String s) {

    }

    @Override
    public void onExtendedTrackingQualityChanged(ImageTracker imageTracker, String s, int i, int i1) {

    }


	//Rendering engine main render loop
    @Override
    public void onBeginRenderFrame() {
        super.onBeginRenderFrame();

        //let Wikitude update tracking and the camera texture for this frame
        renderExtension.onDrawFrame(null);

    }
}

 

Am I missing some renderExtension lifecycle call?


Just to be sure I understand correctly: is using

startupConfiguration.setTextureId()

enough to have the texture bound, rather than rendered to the screen, during the onDrawFrame() call?


Thanks.

Regards,

    Vincent

Hi Vincent,

Until Wikitude SDK 6.1 is released, you need to call the renderExtension in both cases as it also drives our internal update logic.

You simply treat the texture you created for the Wikitude SDK as any other texture that you want to render. The Wikitude SDK only takes care of filling it with the appropriate content, and the content is only updated if you call all of the renderExtension's methods correctly. It's just that the render extension does not render the camera image to the screen if a specific target texture is set through the APIs you mentioned before.


The difference between the two APIs you mentioned is that the second one allows you to change the texture id at runtime, while the first one uses that texture from start to end. (You would need the second behaviour in case you need to change the texture size because you e.g. switched between the front and back cameras, which run at different resolutions.)
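To illustrate the difference (a sketch using the names quoted in this thread; the variable names are placeholders):

```java
// Variant 1: texture id fixed for the whole SDK lifetime.
// Set it on the startup configuration before WikitudeSDK.onCreate().
NativeStartupConfiguration startupConfiguration = new NativeStartupConfiguration();
startupConfiguration.setLicenseKey(licenseKey);
startupConfiguration.setTextureId(cameraTextureId);
wikitudeSDK.onCreate(applicationContext, activity, startupConfiguration);

// Variant 2: texture id changeable at runtime, e.g. after switching to a
// camera that runs at a different resolution. Call once the SDK is started.
wikitudeSDK.getCameraManager().setTextureId(newCameraTextureId);
```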


Best regards,

Andreas 

Hi,

It helps, but I still have some doubts:


According to your summary, can I choose whether or not to use the renderExtension? I understood from our previous exchanges that I need to manage the renderExtension instance and call at least onDrawFrame() to have the AR working.


Here is what I do:

I create a new texture in my rendering engine and set the Wikitude camera manager to use the same texture index. All I expect from the AR engine is to bind the camera picture to this texture; the rendering engine manages the creation (in RGB888), the rendering, etc.
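Concretely, the texture creation on my side looks roughly like this (a sketch; the sizes and variable names are placeholders):

```java
import android.opengl.GLES20;

// Create an empty RGB texture that Wikitude should fill with the camera picture.
// cameraWidth/cameraHeight should match the camera frame resolution.
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
int cameraTextureId = textures[0];

GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, cameraTextureId);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB,
        cameraWidth, cameraHeight, 0,
        GLES20.GL_RGB, GLES20.GL_UNSIGNED_BYTE, null);

// Tell Wikitude to render the camera frame into this same texture.
mWikitudeSDK.getCameraManager().setTextureId(cameraTextureId);
```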


So, let's start from your first case: I provide the RGB texture and I render it. What do I need to do in my Wikitude AR library to get this offscreen rendering (i.e. texture filling)?


The texture I saw at index 0 is probably the Android camera capture in luminance format that you use internally.



Last point : 

To set the texture in the Wikitude OpenGL context, I have

startupConfiguration.setTextureId()


and


mWikitudeSDK.getCameraManager().setTextureId()

 

Is there any difference ? 



Thanks for your help.

Regards, 

    Vincent

Hi Vincent,

I'm a bit confused now. Are you passing your own OpenGL ES 2 texture to the Wikitude SDK to render the camera image into it or do you rely on the RenderExtension to render the camera image? My confusion comes from your last message and the mentioned OpenGL ES texture index 0.


A short summary: 

* If you provide us with an OpenGL ES 2 texture, you need to render it yourself. This texture should be in the RGB format. Internally, we do an offscreen rendering from our internal camera frame into the given OpenGL ES 2 texture.

* In case you use the render extension, we draw the camera image directly to the screen, into whichever OpenGL ES 2 context is currently bound.


In both cases we have another texture that contains the original camera frame that we captured. So maybe this OpenGL ES 2 texture with index 0 is the one that we created internally.


Does this help?

Andreas

Hi Andreas.


So, here is my loop : 


  • Main 3D loop start
  • AR onDrawFrame()
  • 3D render of the background picture from the specified index, using the RGB texture data
  • 3D render of the other content
  • Main 3D loop end
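In code, the loop body looks roughly like this (the drawBackground/drawScene names are placeholders from my engine):

```java
@Override
public void onBeginRenderFrame() {
    super.onBeginRenderFrame();

    // 1. Let Wikitude update tracking and fill the shared camera texture.
    renderExtension.onDrawFrame(null);

    // 2. Draw the background quad sampling the shared RGB texture.
    renderingEngine.drawBackground(cameraTextureId);

    // 3. Draw the rest of the 3D scene on top.
    renderingEngine.drawScene();
}
```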

The AR callbacks send me states indicating that the camera is running and image detection is working, but the 3D background render does not render anything.
I used a GPU profiler and got a strange result: the camera image seems to be bound at GL index 0, as a luminance picture.

Could I please have more details about the GL calls onDrawFrame() makes?
It's quite hard to make two GL engines work together, so any details about the contents of this "black box" method would be very useful.

Thanks.
Regards, 
    Vincent


Hi Vincent,

It is done in the onDrawFrame() method, but you need to wait for that call to return in order to be sure it's done - that sounds somewhat complicated, but I hope you can follow my intention.


The texture format is RGB, yes.
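So, per frame on the GL thread, the safe ordering is (a sketch; the texture id variable is a placeholder):

```java
// 1. Wikitude fills the target RGB texture inside this call:
renderExtension.onDrawFrame(null);

// 2. Only after onDrawFrame() has returned is the texture content
//    up to date, so bind and draw it now:
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, cameraTextureId);
// ... draw the fullscreen background quad, then the 3D content on top ...
```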


Best regards,

Andreas
