
Separate tracking from rendering

Hi,

My first question (https://support.wikitude.com/support/discussions/topics/5000081332) may not have been clear, so here is a better one:


I would like to use Wikitude for tracking, but I would like to keep my own rendering engine.

Based on the SimpleClientTracking example I saw how I can do my own external rendering of the 3D model, but I would like to render the background picture too.


It seems the RenderExtension does the job by itself in the GLRenderer, but is it possible, and if so how, for me to drive the background render (the camera picture) myself? I don't need to retrieve the picture itself; just being able to ask for the background picture's texture binding would be enough.

I really need to drive the render operations myself, so I can order them and make them fit my custom rendering engine.


Thanks for any help you can provide.


Regards, 

    Vincent


Hi Andreas,


When you say "after", do you mean after the method onDrawFrame() has returned (so I cannot know exactly when), or is it done inside onDrawFrame(), before the method returns?


Another point: is the texture in RGB?


Thanks again.

Regards, 

   Vincent


Hi, 

I created a Bitbucket repository with a test project containing minimal test code. I would prefer not to make it public; could you send me one or more email addresses so I can grant read access to this repository?

That way we will have a common code base to understand each other.


Thanks.

Regards, 

    Vincent

Hi Andreas, 


I think I am calling the renderExtension methods as requested.

Here is an extract : 


 

package com.innersense.osmose.ar.wikitudeandroid;

import android.app.Activity;
import android.content.Context;
import android.util.Log;

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.math.Matrix3;
import com.badlogic.gdx.math.Matrix4;
import com.badlogic.gdx.math.Quaternion;
import com.badlogic.gdx.math.Vector3;
import com.wikitude.NativeStartupConfiguration;
import com.wikitude.WikitudeSDK;
import com.innersense.osmose.ar.wikitude.ARWikitudeEngine;
import com.wikitude.common.camera.CameraSettings;
import com.wikitude.common.camera.CameraSettings.CameraPosition;
import com.wikitude.common.camera.CameraSettings.CameraResolution;
import com.wikitude.common.rendering.RenderExtension;
import com.wikitude.rendering.ExternalRendering;
import com.wikitude.tracker.ImageTarget;
import com.wikitude.tracker.ImageTracker;
import com.wikitude.tracker.ImageTrackerListener;
import com.wikitude.tracker.TargetCollectionResource;
import com.wikitude.tracker.TargetCollectionResourceLoadingCallback;

/**
 * Created by Vincent on 31/01/2017.
 */

public class ARWikitudeEngineAndroid extends ARWikitudeEngine implements ImageTrackerListener, ExternalRendering {

    private static final String TAG = "ARWikitudeEngineAndroid";


    private WikitudeSDK mWikitudeSDK;
    /**
     * Android activity instance.
     */
    private Activity activity;
    /**
     * The Wikitude license key to use.
     */
    private final String wikitudeLicenseKey;
    private TargetCollectionResource mTargetCollectionResource;

    RenderExtension renderExtension;

    public ARWikitudeEngineAndroid(String datasetFileName, boolean isAbsolute, String licenseKey) {
        super(datasetFileName, isAbsolute);
        this.wikitudeLicenseKey = licenseKey;
    }


    @Override
    public void onARStart() {
        super.onARStart();

        mWikitudeSDK = new WikitudeSDK(this);
        NativeStartupConfiguration startupConfiguration = new NativeStartupConfiguration();
        startupConfiguration.setLicenseKey(wikitudeLicenseKey);
        startupConfiguration.setCameraPosition(CameraPosition.BACK);
        startupConfiguration.setCameraResolution(CameraResolution.AUTO);
        startupConfiguration.setTextureId(7); // TODO: set at runtime?

        mWikitudeSDK.onCreate(activity.getApplicationContext(), activity, startupConfiguration);

        mTargetCollectionResource = mWikitudeSDK.getTrackerManager().createTargetCollectionResource(datasetFileName, new TargetCollectionResourceLoadingCallback() {
            @Override
            public void onError(int errorCode, String errorMessage) {
                Log.e(TAG, "Error : Failed to load target collection resource. Reason: " + errorMessage);
            }

            @Override
            public void onFinish() {
                mWikitudeSDK.getTrackerManager().createImageTracker(mTargetCollectionResource, ARWikitudeEngineAndroid.this, null);
            }
        });
    }

    @Override
    public void onARResume() {
        super.onARResume();
        mWikitudeSDK.onResume();
        renderExtension.onResume();
    }

    @Override
    public void onARDestroy() {
        super.onARDestroy();
        mWikitudeSDK.clearCache();
        mWikitudeSDK.onDestroy();
    }

    @Override
    public void onARPause() {
        super.onARPause();
        mWikitudeSDK.onPause();
        renderExtension.onPause();
    }

    @Override
    public void onARConfigurationChanged() {
        super.onARConfigurationChanged();
    }

    @Override
    public void onRenderExtensionCreated(final RenderExtension renderExtension) {
	
        //set texture ID
        mWikitudeSDK.getCameraManager().setTextureId(this.renderingEngine.getImageIndex());

        this.renderExtension = renderExtension;
        this.renderExtension.onSurfaceCreated(null, null);
    }

    @Override
    public void onTargetsLoaded(ImageTracker imageTracker) {
    }

    @Override
    public void onErrorLoadingTargets(ImageTracker tracker, int errorCode, final String errorMessage) {
    }

    @Override
    public void onImageRecognized(ImageTracker tracker, final String targetName) {
    }

    @Override
    public void onImageTracked(ImageTracker imageTracker, ImageTarget imageTarget) {
    }

    @Override
    public void onImageLost(ImageTracker imageTracker, String s) {

    }

    @Override
    public void onExtendedTrackingQualityChanged(ImageTracker imageTracker, String s, int i, int i1) {

    }


	//Rendering engine main render loop
    @Override
    public void onBeginRenderFrame() {
        super.onBeginRenderFrame();

        //make a new render loop
        renderExtension.onDrawFrame(null);

    }
}

 

Am I missing some renderExtension lifecycle call?


Just to be sure I understand correctly: is using

startupConfiguration.setTextureId()

enough to have the texture bound instead of rendered during the onDrawFrame() call?


Thanks.

Regards,

    Vincent

Hi Vincent,

It is done in the onDrawFrame() method, but you need that call to return in order to be sure it's done. Sounds somewhat complicated, but I hope you can follow my intention.
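To illustrate the contract Andreas describes — the camera texture content is only guaranteed once onDrawFrame() has returned — here is a sketch of the host engine's per-frame callback. The Wikitude RenderExtension is stubbed out with a local interface purely for illustration; in the real project `renderExtension` is the object received in onRenderExtensionCreated(), and `drawBackground`/`drawAugmentations` stand in for whatever the host engine actually does.

```java
import java.util.ArrayList;
import java.util.List;

public class FrameLoopSketch {
    /** Minimal stand-in for com.wikitude.common.rendering.RenderExtension (illustration only). */
    interface RenderExtensionStub { void onDrawFrame(); }

    static final List<String> callOrder = new ArrayList<>();

    static void renderFrame(RenderExtensionStub renderExtension) {
        // 1. Let the Wikitude render extension run; the camera texture
        //    content is guaranteed only once this call has returned.
        renderExtension.onDrawFrame();
        // 2. Only now draw the background quad from the camera texture...
        callOrder.add("drawBackground");
        // 3. ...then the augmentations on top of it.
        callOrder.add("drawAugmentations");
    }

    public static void main(String[] args) {
        renderFrame(() -> callOrder.add("onDrawFrame"));
        System.out.println(callOrder); // [onDrawFrame, drawBackground, drawAugmentations]
    }
}
```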


The texture format is RGB, yes.


Best regards,

Andreas

Hi Andreas, 


Thanks a lot, this gives me a very good starting point for managing my own rendering, but I still have one more question:

If I make my plugin manage the image rendering and the 3D model rendering, will I still need to call RenderExtension.onDrawFrame() to keep the Wikitude loop working (capture, detection)? Will it perform a render that I'll need to ignore in order to do my own?


Thanks.

Regards,

    Vincent


Hi Andreas,

I'm starting from the External Rendering example, yes.

In this example, the RenderExtension is managed via a GLSurfaceView.Renderer, a Driver, and a GLSurfaceView to make it work.

Because I'm using my own rendering engine, the GLSurfaceView and the Driver are already handled by my engine.


So here is the question: to have the Wikitude camera start and loop in the background, do I just need to manage the RenderExtension (i.e. the GLRenderer class from the example without the StrokedRectangle part, plus an Activity that implements ExternalRendering to manage the renderExtension)?


Here is what I'm trying to do : 

My Activity implements ImageTrackerListener and ExternalRendering.

Like in the example, the WikitudeSDK is started, paused, etc.

In the onRenderExtensionCreated() method, I create a new GLSurfaceView.Renderer instance that drives the RenderExtension, once again like in the example.

It calls onDrawFrame() and the other renderExtension methods.


Last, I use my rendering loop to call myGLRenderer.onDrawFrame().


Is that enough to have the Wikitude camera start and run? Or am I missing something important?


Thanks.

Regards,

    Vincent

Hi Vincent,

I would really like to find out why it's not working for you. 


I'm looking forward to your questions/results ;)


Best regards,

Andreas

Hi Andreas.


So, here is my loop : 


  • Main 3D loop start
  • AR onDrawFrame()
  • 3D render of the background picture, from the specified index with the RGB texture data
  • 3D render of the other objects
  • Main 3D loop end

The AR callbacks send me states indicating the camera is running and image detection is working, but the 3D background render does not render anything.
I used a GPU profiler and got a strange result: the camera image seems to be bound at GL index 0, as a luminance picture.

Could I have more details about the GL calls that onDrawFrame() makes, please?
It's quite hard to make two GL engines work together, so any details about the contents of this "black box" method would be very useful.

Thanks.
Regards, 
    Vincent


Hi Andreas,


Thanks for the bug report. I think it would be useful to have a way to prevent Wikitude from rendering the background. One idea would be to let Wikitude bind the RGB image to a user-supplied texture index; that way the render can be managed efficiently from an external renderer, without having to copy the whole picture data.

We will use iOS SDK too, but in a later stage.

By the way, I sent an email as mentioned, to get the latest feature you are working on.


While waiting for it, I'll do the Wikitude integration without this feature, using a plugin.

Thanks a lot.


Regards,

    Vincent

Hi Vincent,

You can already pass a user-generated OpenGL ES texture to the Wikitude SDK; we then render the camera image into that texture. The Android API is `CameraManager.setTextureId`, or `[WTCaptureDeviceManager setCameraRenderingTargetTexture]` for iOS. Is that what you were referring to in your last reply?
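On Android, that could look roughly like the sketch below: create a texture on the GL thread, then hand its id to the SDK. This is an assumption on my side (the texture parameters and the `createCameraTexture` helper are illustrative, not from the Wikitude docs); `mWikitudeSDK` is the WikitudeSDK instance from the snippet earlier in the thread.

```java
// Sketch (Android, OpenGL ES 2): create a texture on the GL thread and
// hand it to the Wikitude SDK so the camera frame is rendered into it.
private int createCameraTexture() {
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
    // Filtering/wrap modes are an assumption; adjust to your engine's needs.
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    return tex[0];
}

// Once the SDK is started, pass the real texture id instead of a
// hard-coded value like the `7` seen earlier in this thread:
int cameraTexId = createCameraTexture();
mWikitudeSDK.getCameraManager().setTextureId(cameraTexId);
```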


Best regards,

Andreas



Hi Vincent,

The usage of the render extension seems fine.

In the snippet you sent, you set the texture id to `7`. Is this just for debugging, or do you also use this in your production code?


Could you try to use CameraManager.setTextureId()? You can call it once the Wikitude SDK has started.


To speed things up, would it be possible for you to send us a minimal demo project so that we can have a look at it?


Best regards,

Andreas

Hi Vincent,

The only two things you need to take care of are calling our SDK's lifecycle methods correctly and calling the render extension's `onDrawFrame` within your engine environment (every frame). The rest (surface view, renderer) is example-specific and you don't need to care about it.


Best regards,

Andreas 

Hi Vincent,

I responded via email - more information asap.


Best regards,

Andreas

Hi Andreas, 

Yes, it looks like what I'm looking for.

It will take a little more work to integrate Wikitude into our system, but this could be the way to make it work.

Thanks.

Regards,

  Vincent

Hi Vincent,

I'm a bit confused now. Are you passing your own OpenGL ES 2 texture to the Wikitude SDK to render the camera image into it or do you rely on the RenderExtension to render the camera image? My confusion comes from your last message and the mentioned OpenGL ES texture index 0.


A short summary: 

* If you provide us with an OpenGL ES 2 texture, you need to render it yourself. This texture should be in RGB format. Internally we do an offscreen rendering from our internal camera frame into the given OpenGL ES 2 texture.

* If you use the render extension instead, we draw the texture directly to the screen, into whichever OpenGL ES 2 context is currently bound.


In both cases there is another texture that contains the original camera frame we captured. So maybe the OpenGL ES 2 texture with index 0 is the one we created internally.
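For the first mode (SDK renders offscreen into a texture you supplied), the background draw in the host engine could be sketched like this. Everything here is illustrative, not Wikitude API: shader compilation into `program` is omitted, `cameraTexId` is the id passed via `CameraManager.setTextureId`, and depending on your engine the UVs may need a vertical flip.

```java
// Sketch (Android, OpenGL ES 2): draw the SDK-filled RGB camera texture
// as a fullscreen background quad, before rendering the 3D content.
private static final String VS =
    "attribute vec2 aPos; varying vec2 vUv;" +
    "void main() { vUv = aPos * 0.5 + 0.5; gl_Position = vec4(aPos, 0.0, 1.0); }";
private static final String FS =
    "precision mediump float; varying vec2 vUv; uniform sampler2D uTex;" +
    "void main() { gl_FragColor = texture2D(uTex, vUv); }";

// Two triangles covering all of clip space, so the quad fills the screen.
private final FloatBuffer quad = ByteBuffer
    .allocateDirect(8 * 4).order(ByteOrder.nativeOrder()).asFloatBuffer()
    .put(new float[] { -1, -1,  1, -1,  -1, 1,  1, 1 });

void drawCameraBackground(int program, int cameraTexId) {
    GLES20.glUseProgram(program);
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, cameraTexId);
    GLES20.glUniform1i(GLES20.glGetUniformLocation(program, "uTex"), 0);
    int aPos = GLES20.glGetAttribLocation(program, "aPos");
    quad.position(0);
    GLES20.glVertexAttribPointer(aPos, 2, GLES20.GL_FLOAT, false, 0, quad);
    GLES20.glEnableVertexAttribArray(aPos);
    // Draw the background with depth writes off, so the 3D content
    // rendered afterwards always appears on top of the camera image.
    GLES20.glDepthMask(false);
    GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    GLES20.glDepthMask(true);
    GLES20.glDisableVertexAttribArray(aPos);
}
```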


Does this help?

Andreas
