I am trying to share my AR scene through a video call (think of it as Vuforia Chalk). I am okay with the receiver getting a 2D video stream, but the client making the call should be in a proper 3D AR scene. I am using Unity.
Is this possible via the Wikitude SDK, or via the ARToolKit and ARCore it consumes?
Do I understand correctly that Device A should augment a scene and send a stream of that scene, including the augmentation, to Device B, and that Device B cannot interact with the augmentation and only consumes the stream?
If streaming the camera scene itself is not important, you could send just the position of the augmentation(s) to the client. This is also very low-bandwidth and fast. The precondition is that Device B also has the augmentation.
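A minimal sketch of that idea, assuming both devices already have the augmentation asset locally and only pose data needs to travel. The message format here (3 position floats plus a rotation quaternion) is made up for illustration; it is not part of any SDK:

```java
import java.nio.ByteBuffer;

// Hypothetical pose message: 3 floats position + 4 floats quaternion = 28 bytes.
// Device A packs the augmentation's pose each frame and sends the bytes over
// its call channel; Device B unpacks them and places its local copy of the
// same augmentation at that pose.
public class PoseMessage {
    public final float[] position;   // x, y, z
    public final float[] rotation;   // quaternion x, y, z, w

    public PoseMessage(float[] position, float[] rotation) {
        this.position = position;
        this.rotation = rotation;
    }

    public byte[] pack() {
        ByteBuffer buf = ByteBuffer.allocate(28);
        for (float f : position) buf.putFloat(f);
        for (float f : rotation) buf.putFloat(f);
        return buf.array();
    }

    public static PoseMessage unpack(byte[] data) {
        ByteBuffer buf = ByteBuffer.wrap(data);
        float[] pos = { buf.getFloat(), buf.getFloat(), buf.getFloat() };
        float[] rot = { buf.getFloat(), buf.getFloat(),
                        buf.getFloat(), buf.getFloat() };
        return new PoseMessage(pos, rot);
    }
}
```

Even at 30 updates per second this is under 1 KB/s, compared to the megabits per second a video stream would need.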
Unfortunately, we currently have no way to capture a video of the full AR scene comparable to our captureScreen() function. For streaming a video of the scene, other solutions should be available.
OK, then the described solution would be our approach.
Video capturing is on our roadmap, but we have no specific timeline for this feature yet.
I'm also looking for a way to stream AR-enriched video frames using the Android native SDK.
Could you maybe point us in the right direction to be able to extract frames (preferably in NV21 format) from the GLSurfaceView?
I've been looking at the Grafika examples (ContinuousCaptureActivity and RecordFBOActivity) but these always end up recording video (mp4) instead of providing actual frames.
Same with RecordableSurfaceView (https://github.com/UncorkedStudios/recordablesurfaceview).
Any hints would be greatly appreciated!
You can get the frame data with a C++ plugin. Please take a look at the barcode plugin in our example app.
I would recommend using the SDK 8.0 beta when starting with a plugin, since there were big changes to the plugins API from 7 to 8.
I know I can get the camera frames via a C++ plugin (that's what I have now), but I need the camera frames plus anything drawn over them in the GLSurfaceView (cf. screen sharing).
Currently I've been able to incorporate Grafika's CircularEncoder; now I just need to figure out how to get video frames from the buffer. It's disturbingly complex to accomplish such a seemingly simple task :)
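For the NV21 part, one possible approach (a sketch of a general technique, not Wikitude or Grafika API) is to read the composited RGBA pixels back on the GL thread with GLES20.glReadPixels and then convert them to NV21 on the CPU. The conversion step is plain Java, using the standard BT.601 studio-swing integer coefficients:

```java
// Convert a tightly packed RGBA8888 buffer (e.g. obtained via
// GLES20.glReadPixels on the GL thread after the frame is rendered) to NV21:
// a full-resolution Y plane followed by interleaved V/U samples at 2x2
// subsampling. Width and height are assumed even. Note glReadPixels returns
// the image bottom-up, so rows may need flipping before or after this step.
public class Nv21Converter {
    public static byte[] rgbaToNv21(byte[] rgba, int width, int height) {
        byte[] nv21 = new byte[width * height * 3 / 2];
        int yIndex = 0;
        int uvIndex = width * height;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int p = (j * width + i) * 4;
                int r = rgba[p] & 0xFF;
                int g = rgba[p + 1] & 0xFF;
                int b = rgba[p + 2] & 0xFF;
                // BT.601 studio-swing: Y in [16, 235].
                int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                nv21[yIndex++] = (byte) Math.max(16, Math.min(235, y));
                // One V/U pair per 2x2 block, sampled at its top-left pixel.
                if (j % 2 == 0 && i % 2 == 0) {
                    int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                    int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                    nv21[uvIndex++] = (byte) Math.max(0, Math.min(255, v));
                    nv21[uvIndex++] = (byte) Math.max(0, Math.min(255, u));
                }
            }
        }
        return nv21;
    }
}
```

The per-frame glReadPixels readback is slow on many devices; a common optimization is rendering into an extra FBO and using a PBO for asynchronous readback, but the CPU-side conversion above stays the same.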