Hi,
Thanks for your reply.
I've seen the CustomCamera sample in the JavaScript SDK, but it looks the same as the one in the native SDK and renders an orange rectangle natively. Is it possible to let the JavaScript framework render a 3D object that has been set up in the studio? If so, could you explain how to achieve it?
Thanks
Hi Laurent,
The green screen you are describing when requestsInputFrameRendering returns true is a known issue introduced with SDK 6.0 on Android. It will be fixed in one of the next releases.
What you can do until this is fixed is render the camera image yourself.
You are right about our custom camera sample. It renders everything natively.
It is used to show the possibilities of the input plugins API, but it is also possible to use the ARchitectView to render drawables (e.g. 3D models).
This is also independent of the requestsInputFrameRendering return value.
To see how you can display a 3D model on a target image in the ARchitectView, take a look at the 3D model sample code and the 3D model documentation. You can then apply this to your ARchitectView.
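As a rough orientation, a minimal ARchitectView world based on the 3D model sample could look like the sketch below. This assumes the SDK 6 JavaScript API names (AR.ClientTracker, AR.Model, AR.Trackable2DObject); the file paths "assets/tracker.wtc" and "assets/car.wt3" and the target name "myTarget" are placeholders — use the target collection and model exported from your own studio project.

```javascript
// Sketch: render a 3D model on an image target inside the ARchitectView.
var World = {
    init: function initFn() {
        // Load the target collection (.wtc) created in the studio.
        var tracker = new AR.ClientTracker("assets/tracker.wtc", {
            onLoaded: function () {
                // Tracker is ready; recognition starts automatically.
            }
        });

        // The 3D model exported from the encoder as .wt3.
        var model = new AR.Model("assets/car.wt3", {
            scale: { x: 0.5, y: 0.5, z: 0.5 }
        });

        // Attach the model to the image target so the ARchitectView
        // renders it whenever the target is recognized.
        new AR.Trackable2DObject(tracker, "myTarget", {
            drawables: { cam: [model] }
        });
    }
};

World.init();
```

This rendering happens entirely inside the ARchitectView, independent of what requestsInputFrameRendering returns in your plugin.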
Best regards,
Alex
Hi Laurent,
The InputPlugin color rendering issue is planned to be resolved with release 6.1.
This issue causes wrong colors to be rendered but you should not get a completely green screen.
Please make sure to set the input frame size during 'initialize' of the plugin by using
getFrameSettings().setInputFrameSize({width, height});
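In a custom input plugin that could look roughly like the sketch below. This is only an illustration, assuming the Wikitude Android Plugin API (a Plugin base class with getFrameSettings()) and assuming the frame size is passed as an android.graphics.Point; the class name and the 640x480 size are placeholders — the size must match the preview size your camera actually delivers.

```java
import android.graphics.Point;

// Hypothetical custom input plugin; names other than initialize()
// and getFrameSettings().setInputFrameSize(...) are placeholders.
public class CustomCameraPlugin extends Plugin {

    public CustomCameraPlugin(String identifier) {
        super(identifier);
    }

    @Override
    public void initialize() {
        // Set the input frame size here, before any camera frames
        // are pushed to the SDK, so internal rendering can use it.
        getFrameSettings().setInputFrameSize(new Point(640, 480));
    }
}
```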
Best Regards,
Alex
Hi Laurent,
If you are using the code from our CustomCameraActivity in the sample app, you need to move the setFrameSize call outside of the runOnUiThread block; otherwise it will be set too late for internal rendering.
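Sketched against the sample's structure, the corrected placement could look like this. The method and field names (onCameraPreviewAvailable, wikitudePlugin, startCameraPreviewUi) are hypothetical stand-ins for the corresponding code in CustomCameraActivity; the point is only where the call sits relative to runOnUiThread.

```java
import android.graphics.Point;

// Hypothetical callback in CustomCameraActivity once the camera
// preview size is known.
private void onCameraPreviewAvailable(final int width, final int height) {
    // Correct: set the frame size synchronously, on the current thread,
    // so it is available before internal rendering starts.
    wikitudePlugin.getFrameSettings().setInputFrameSize(new Point(width, height));

    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            // Only UI work belongs here; calling setInputFrameSize inside
            // this block would happen too late for internal rendering.
            startCameraPreviewUi();
        }
    });
}
```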
Best Regards,
Alex
Hi Laurent,
It is expected that the InputPlugin starts with a short black screen; what is not expected is that rendering only starts after tapping the screen.
I have never experienced this behaviour.
Did you only change the YUVFrameInputPlugin.cpp and the position of the setFrameSize in the CustomCameraActivity?
With release 6.1 we are also introducing a new, simple InputPlugin sample which uses a custom camera to get the frames and lets the Wikitude SDK render them.
It seems like this is exactly what you want to accomplish.
Best Regards,
Alex
Hi Laurent,
The 6.1 release is planned for the end of March.
You can download and try a pre-release version of it here.
Please note that this is not a final version.
The camera controls you need will not be supported in 6.1.
We do not have those features on our roadmap yet, so we can't provide any details on if or when they will be implemented.
Best Regards,
Alex
Laurent Latour
Hi,
My goal is to use the Wikitude SDK to render a 3D object on an image target (using the JavaScript API for Android would be very nice), but I need to control the camera myself. I've seen that controlling the camera is possible in the CustomCamera Plugins API sample, but in this sample the rendering is done natively. Is it possible to use the ARchitectView in combination with a native API?
Thanks