
Separate tracking from rendering

Hi,

My first question (https://support.wikitude.com/support/discussions/topics/5000081332) may not have been clear, so here is a better one:


I would like to use Wikitude for tracking, but I would like to keep my own rendering engine.

Based on the SimpleClientTracking example I saw how I can make my own external renderer for the 3D model, but I would like to render the background picture too.


It seems the RenderExtension does the job by itself in the GLRenderer, but is it possible for me to drive the background render (the camera picture) myself, and if so, how? I don't need to get the picture itself; just being able to ask for the background picture's texture binding would be enough.

I really need to drive the render operations myself so that I can order them to match my custom rendering engine.


Thanks for any help you could provide me.


Regards, 

    Vincent


Hi Vincent,

You've encountered an implementation detail of the Android Native SDK: it doesn't clearly separate updating its internal logic from rendering. If you also use the Wikitude iOS SDK, you will see that there you can update logic and rendering separately.

For now there is nothing you can do about this, but I have created an internal bug report and we will fix this in our SDK 2.1, which is likely to be released at the end of March. If you are interested in early pre-release builds, simply write an email to info@wikitude.com and mention this thread.


Best regards,

Andreas



Hi Vincent,

We released an update of our Native SDK just yesterday. This update contains the separation of logic and render updates in the RenderExtension class. Maybe this already helps you.
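
To illustrate the split, here is a minimal, hedged sketch. The RenderExtensionStub interface below is a hypothetical stand-in for com.wikitude.common.rendering.RenderExtension; the onUpdate()/onDrawFrame() names are taken from this thread and the linked rendering documentation, not from a verified SDK header.

```java
import java.util.ArrayList;
import java.util.List;

public class SeparatedUpdateSketch {

    // Hypothetical stand-in for com.wikitude.common.rendering.RenderExtension;
    // the onUpdate()/onDrawFrame() method names come from this thread.
    interface RenderExtensionStub {
        void onUpdate();    // advance the SDK's internal tracking logic
        void onDrawFrame(); // update the SDK's rendering / camera texture
    }

    // One frame of a custom engine loop: SDK logic first, SDK render next,
    // then the engine's own scene on top.
    static List<String> runOneFrame(RenderExtensionStub ext) {
        List<String> order = new ArrayList<>();
        ext.onUpdate();
        order.add("sdkLogicUpdated");
        ext.onDrawFrame();
        order.add("sdkFrameDrawn");
        order.add("engineSceneDrawn"); // your own draw calls go here
        return order;
    }

    public static void main(String[] args) {
        RenderExtensionStub noop = new RenderExtensionStub() {
            public void onUpdate() {}
            public void onDrawFrame() {}
        };
        System.out.println(runOneFrame(noop)); // prints [sdkLogicUpdated, sdkFrameDrawn, engineSceneDrawn]
    }
}
```

The point is only the ordering: drive the SDK's logic update, then its render update, then your own scene, on every iteration of your custom loop.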

Anyhow, I would be interested to see how you use the Wikitude SDK in your application, so if you could give me read access, that would be nice. Simply write to android-sdk@wikitude.com and refer to this thread; it will be forwarded to me and I will get back to you as soon as possible with more details.


Thanks,

Andreas

Hi, 

So, here is my main Wikitude class; is it enough?

 

package com.innersense.osmose.ar.wikitudeandroid;

import android.app.Activity;
import android.content.Context;
import android.util.Log;

import com.badlogic.gdx.Gdx;
import com.wikitude.NativeStartupConfiguration;
import com.wikitude.WikitudeSDK;
import com.innersense.osmose.ar.wikitude.ARWikitudeEngine;
import com.wikitude.common.camera.CameraSettings;
import com.wikitude.common.camera.CameraSettings.CameraPosition;
import com.wikitude.common.camera.CameraSettings.CameraResolution;
import com.wikitude.common.rendering.RenderExtension;
import com.wikitude.rendering.ExternalRendering;
import com.wikitude.tracker.ImageTarget;
import com.wikitude.tracker.ImageTracker;
import com.wikitude.tracker.ImageTrackerListener;
import com.wikitude.tracker.TargetCollectionResource;
import com.wikitude.tracker.TargetCollectionResourceLoadingCallback;


public class ARWikitudeEngineAndroid extends ARWikitudeEngine implements ImageTrackerListener, ExternalRendering {

    private static final String TAG = "ARWikitudeEngineAndroid";

    private WikitudeSDK mWikitudeSDK;
    /**
     * Android activity instance.
     */
    private Activity activity;
    /**
     * The Wikitude license key to use.
     */
    private final String wikitudeLicenseKey;
    private TargetCollectionResource mTargetCollectionResource;

    private RenderExtension renderExtension;

    public ARWikitudeEngineAndroid(String datasetFileName, boolean isAbsolute, String licenseKey) {
        super(datasetFileName, isAbsolute);
        this.wikitudeLicenseKey = licenseKey;
    }


    @Override
    public void onARStart() {
        super.onARStart();

        mWikitudeSDK = new WikitudeSDK(this);
        NativeStartupConfiguration startupConfiguration = new NativeStartupConfiguration();
        startupConfiguration.setLicenseKey(wikitudeLicenseKey);
        startupConfiguration.setCameraPosition(CameraPosition.BACK);
        startupConfiguration.setCameraResolution(CameraResolution.AUTO);

        mWikitudeSDK.onCreate(activity.getApplicationContext(), activity, startupConfiguration);

        mTargetCollectionResource = mWikitudeSDK.getTrackerManager().createTargetCollectionResource(datasetFileName, new TargetCollectionResourceLoadingCallback() {
            @Override
            public void onError(int errorCode, String errorMessage) {
                Log.e(TAG, "Error : Failed to load target collection resource. Reason: " + errorMessage);
            }

            @Override
            public void onFinish() {
                mWikitudeSDK.getTrackerManager().createImageTracker(mTargetCollectionResource, ARWikitudeEngineAndroid.this, null);
            }
        });
    }

    @Override
    public void onARResume() {
        super.onARResume();
        mWikitudeSDK.onResume();
    }

    @Override
    public void onARDestroy() {
        super.onARDestroy();
        mWikitudeSDK.clearCache();
        mWikitudeSDK.onDestroy();
    }

    @Override
    public void onARPause() {
        super.onARPause();
        mWikitudeSDK.onPause();
    }

    @Override
    public void onARConfigurationChanged() {
        super.onARConfigurationChanged();
    }

    @Override
    public void onRenderExtensionCreated(final RenderExtension renderExtension) {

        this.renderExtension = renderExtension;
        this.renderExtension.onSurfaceCreated(null, null);
    }

    @Override
    public void onTargetsLoaded(ImageTracker imageTracker) {
    }

    @Override
    public void onErrorLoadingTargets(ImageTracker tracker, int errorCode, final String errorMessage) {
    }

    @Override
    public void onImageRecognized(ImageTracker tracker, final String targetName) {
    }

    @Override
    public void onImageTracked(ImageTracker imageTracker, ImageTarget imageTarget) {
    }

    @Override
    public void onImageLost(ImageTracker imageTracker, String s) {
    }

    @Override
    public void onExtendedTrackingQualityChanged(ImageTracker imageTracker, String s, int i, int i1) {

    }

    /*
     * HERE is the main 3D render loop event.
     */
    @Override
    public void onBeginRenderFrame() {
        super.onBeginRenderFrame();

        // drive the Wikitude render extension from our own render loop
        renderExtension.onDrawFrame(null);
    }
}

 

Thanks a lot !

Regards,

    Vincent

Hi Andreas,

I just sent an email.


I downloaded the latest release, and I'm upgrading the project to SDK 2.1.0 so I can test the latest features (separating rendering from tracking).

Thanks.


Regards, 

     Vincent

Hi Vincent,

Until Wikitude SDK 6.1 is released, you need to call the renderExtension in both cases as it also drives our internal update logic.

You simply treat the texture you created for the Wikitude SDK like any other texture that you want to render. The Wikitude SDK only takes care of filling it with the appropriate content, and the content is only updated if you call all of the renderExtension's methods correctly. It's just that the render extension does not render the camera image to the screen if a specific target texture is set through the APIs you mentioned before.


The difference between the two APIs you mentioned is that the second one allows you to change the texture id at runtime, while the first one uses the same texture from start to end. (You would need the second behaviour if you need to change the texture size, e.g. because you switched between front and back cameras that run at different resolutions.)
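
As a rough illustration of the runtime-changeable variant, here is a sketch. CameraTargetStub is entirely hypothetical; the real call mentioned in this thread is setTextureID on the SDK. Only the idea is shown: swapping the target texture id at runtime, e.g. on a front/back camera switch.

```java
public class TextureTargetSketch {

    // Hypothetical stand-in for the SDK object that holds the camera's
    // target texture id (the real API in this thread is setTextureID).
    static class CameraTargetStub {
        private int textureId;
        CameraTargetStub(int initialTextureId) { this.textureId = initialTextureId; }
        void setTextureId(int newId) { this.textureId = newId; } // runtime switch
        int currentTarget() { return textureId; }
    }

    public static void main(String[] args) {
        CameraTargetStub target = new CameraTargetStub(7); // back camera texture
        // front camera runs at a different resolution -> hand over a new texture
        target.setTextureId(8);
        System.out.println(target.currentTarget()); // prints 8
    }
}
```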


Best regards,

Andreas 

Hi, I'll continue asking questions here since it's the same topic:


I'm using the External Renderer, but I still have doubts: what is the minimal code I need to maintain to make Wikitude work (camera capture and image tracking)?

From the example, I removed the GLSurface and the Driver, and only kept the GLRenderer to manage the RenderExtension and call onDrawFrame from my own 3D rendering engine loop. Is this a good start? Or am I missing something?

The API reference does not help much since it is minimalist, and the documentation does not give many more answers.


Thanks a lot !


Regards, 

    Vincent.

Hi Vincent,

This looks good so far, but if you compare your code with the GLRenderer, you will notice that the render extension also has three other methods that you need to call: onSurfaceChanged, onPause, and onResume. Maybe you already call them somewhere else, but in the snippet you sent I couldn't find them.
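
A sketch of what driving all of the entry points from a custom engine might look like. RenderExtensionStub is a hypothetical stand-in using the method names from this thread; the real Wikitude RenderExtension signatures differ (e.g. onSurfaceCreated takes GL10/EGLConfig parameters, as in the snippet above).

```java
import java.util.ArrayList;
import java.util.List;

public class LifecycleForwardingSketch {

    // Hypothetical stand-in for the Wikitude RenderExtension, reduced to the
    // lifecycle methods named in this thread.
    interface RenderExtensionStub {
        void onSurfaceCreated();
        void onSurfaceChanged(int width, int height);
        void onResume();
        void onPause();
        void onDrawFrame();
    }

    // Records which lifecycle calls were forwarded, so a custom engine can
    // verify it drives all five entry points, not just onDrawFrame().
    static List<String> forwarded = new ArrayList<>();

    static RenderExtensionStub recording() {
        return new RenderExtensionStub() {
            public void onSurfaceCreated() { forwarded.add("onSurfaceCreated"); }
            public void onSurfaceChanged(int w, int h) { forwarded.add("onSurfaceChanged"); }
            public void onResume() { forwarded.add("onResume"); }
            public void onPause() { forwarded.add("onPause"); }
            public void onDrawFrame() { forwarded.add("onDrawFrame"); }
        };
    }

    public static void main(String[] args) {
        RenderExtensionStub ext = recording();
        ext.onSurfaceCreated();          // once, when the GL surface exists
        ext.onSurfaceChanged(1280, 720); // on every surface size change
        ext.onResume();                  // from the activity's onResume
        ext.onDrawFrame();               // every frame of the engine loop
        ext.onPause();                   // from the activity's onPause
        System.out.println(forwarded);
    }
}
```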


Best regards,

Andreas

Hi, 

Yes, I did add onPause, onResume, and onSurfaceChanged.

The camera seems to start and detect the picture well :-)


But, now I'm trying to render the camera picture.

My first idea is to avoid using a plugin that would give me a picture I'd have to copy before binding it.

I would prefer to render the texture at the specified GL index directly.


Does renderExtension.onDrawFrame(null) render the camera picture? If this method binds the picture, my renderer could perhaps use the bound content to render it, which would be very useful and quick.

So, could I get an idea of what is done in this call?


Thanks.

Regards, 

   Vincent

Hi Vincent,

The Wikitude Native SDK would then be the product of choice. If you have a look at the `custom camera` example, all your questions should be answered, as this example does almost exactly what you need (except that it injects its own camera feed into the Wikitude SDK).


Best regards,

Andreas

Hi,

I gave you access on the repository, don't hesitate if you have any questions.


I did not find documentation about separating rendering from tracking in the latest version; did you write anything about it?


Thanks.

Regards, 

    Vincent

Hi Vincent,

I'm not quite sure what exactly you're asking for. Our native examples only contain the minimal amount of code needed to do what they do.

I guess for your case that's the 'External Rendering' example. You need to call the Wikitude RenderExtension in some kind of update loop so that we can update image tracking. The camera updates itself in the background, but we need the render extension called in order to update the camera rendering (either to a texture or to the screen).


Best regards,

Andreas

Hi Andreas, 

I found something about renderExtension.onUpdate() and onDrawFrame(); I did the same as shown here: http://www.wikitude.com/external/doc/documentation/latest/androidnative/renderingnative.html#external_rendering_api

It's still not rendering the background picture from the camera... I don't really understand where the problem is...


Thanks

Regards, 

    Vincent

Hi  Andreas,


Yes, the texture id of "7" is for debug/test usage only. I'll use setTextureID too, with the same index, to be sure.


It would be difficult for me to create a small sample project since our rendering engine is not designed to work like that, but I'll think about it; it could save us some significant time.


Thanks again, I'll be back here with more questions or maybe some good results ASAP !


Regards,

    Vincent

Hi Vincent,

After onDrawFrame() has been called, the OpenGL texture that you set in the CameraManager should have its content updated so that you can draw it in your rendering engine. The SDK calls glBindTexture to render the current camera image into that texture.
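
So the ordering that matters each frame is: drive the render extension first, then sample the texture. A toy sketch of that contract follows; the map stands in for GL texture memory, and all names here are hypothetical illustrations, not SDK API.

```java
import java.util.HashMap;
import java.util.Map;

public class CameraTextureSketch {
    // Simulated texture store standing in for GL texture memory.
    static final Map<Integer, String> textures = new HashMap<>();

    // Stand-in for the SDK's render extension filling the target texture
    // with the current camera frame when it is driven for this frame.
    static void sdkOnDrawFrame(int targetTextureId, int frameNumber) {
        textures.put(targetTextureId, "cameraFrame#" + frameNumber);
    }

    // The engine samples the same texture id like any other texture.
    static String engineSample(int textureId) {
        return textures.getOrDefault(textureId, "empty");
    }

    public static void main(String[] args) {
        int cameraTexture = 7;            // id handed to the SDK (e.g. via setTextureID)
        sdkOnDrawFrame(cameraTexture, 1); // 1. drive the render extension
        System.out.println(engineSample(cameraTexture)); // 2. then draw it -> prints cameraFrame#1
    }
}
```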


Best regards,

Andreas
