Hi Laurent,
I'm not sure if you are referring to this thread, but this would be my initial answer.
We do crop the camera frame in an 'aspect fill' manner, meaning that the camera frame is rendered into the OpenGL view without any black borders. So whenever our ArchitectView is, through the Activity, shown fullscreen, we get an 'onSurfaceChanged' event and update the ratio between the camera feed and the OpenGL view accordingly. These values are available in our plugin API (Frame::getScaledWidth and Frame::getScaledHeight, where Frame is an object passed to the Plugin::cameraFrameAvailable method).
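For illustration, the 'aspect fill' mapping described above could be sketched like this. Note this is only a sketch of the geometry, not SDK code: the `Size`, `Point`, `aspectFillScale`, and `viewToFrame` names are my own, and the centered-crop assumption is inferred from the behaviour described in this thread.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative types, not part of the Wikitude SDK.
struct Size  { float w, h; };
struct Point { float x, y; };

// Aspect fill: scale the camera frame by the larger of the two axis
// ratios, so it covers the whole view and the overflow is cropped.
float aspectFillScale(Size frame, Size view) {
    return std::max(view.w / frame.w, view.h / frame.h);
}

// Map a point from view (screen) coordinates back into camera frame
// coordinates, undoing a crop assumed to be centered on both axes.
Point viewToFrame(Point p, Size frame, Size view) {
    float s    = aspectFillScale(frame, view);
    float offX = (frame.w * s - view.w) / 2.0f; // cropped left/right
    float offY = (frame.h * s - view.h) / 2.0f; // cropped top/bottom
    return { (p.x + offX) / s, (p.y + offY) / s };
}
```

With a 1080x1920 camera frame and a taller 1440x2960 fullscreen view (roughly the S8 case), the height ratio wins, so the frame is cropped left and right, and the view center maps back to the frame center.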
Does this answer all your questions?
Best regards,
Andreas
Laurent Latour
Hi,
I'm using Wikitude in my app to detect a target position in the camera frame using the view and projection matrices, and I have a question regarding the new phones with non-16:9 screens:
On the Samsung S8, the user has the ability to set the application to fullscreen mode. When it is enabled, the preview fills the entire screen and is therefore cropped on the left and right sides (when in portrait). In my app I use the matrices to crop the high-resolution captured image, which works fine if I leave the application in 16:9, but if I switch it to fullscreen, the crop becomes larger than my image target. That's because the preview is cropped and the coordinates are in the screen reference frame, not the camera frame.
So my question is: how does Wikitude handle the camera frame crop/stretch for display? Does it assume that if the camera frame is larger than the screen, it will be cropped? Which dimension do you take as the reference?
Thanks,
Laurent