We are working on a remote support project. Most of the features were completed successfully with Wikitude.
Now we have to use the Input Plugin API to make the customer's camera act as our support technician's source camera. We took a look at your documentation, but we couldn't get it working.
Our broadcast video comes from an RTMP server. We need to use this stream as the source camera for the augmentations.
Thank you for your kind response. Almost everything is ready.
I only need to know how I can use the "YUVInputPlugin" to make a video stream the source of the World.
I am using WebRTC to broadcast a video, and I save the instant target objects. Everything is OK, but I need to know more details about the YUTInputPlugin. Is it possible to share a sample code block or a library?
Could you clarify what you mean by the YUTInputPlugin? For our Input Plugins feature, you can find sample code in the JS and Native SDK sample apps. Additionally, you can find details in the technical documentation (here the link to the Android JS SDK):
Here is an additional forum post dealing with WebRTC, which might help:
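As a general hint: input plugins are usually fed raw frames in a planar YUV layout such as NV21 (full-resolution Y plane followed by an interleaved V/U plane at quarter resolution). If your WebRTC pipeline hands you RGB frames, you would have to convert them first. Below is an illustrative, self-contained sketch of that conversion using the common integer BT.601 video-range approximation; it is not Wikitude API code.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Convert an interleaved 8-bit RGB frame to NV21: a width*height Y plane
// followed by an interleaved V/U plane subsampled 2x2.
// Assumes width and height are even.
std::vector<uint8_t> rgbToNv21(const std::vector<uint8_t>& rgb, int width, int height) {
    std::vector<uint8_t> nv21(width * height * 3 / 2);
    const int uvOffset = width * height;
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const int i = (y * width + x) * 3;
            const int r = rgb[i], g = rgb[i + 1], b = rgb[i + 2];
            // BT.601 video-range luma, integer approximation.
            const int Y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
            nv21[y * width + x] = static_cast<uint8_t>(std::clamp(Y, 0, 255));
            // One V/U pair per 2x2 block, sampled at the top-left pixel.
            if (y % 2 == 0 && x % 2 == 0) {
                const int V = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                const int U = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                nv21[uvOffset + (y / 2) * width + x]     = static_cast<uint8_t>(std::clamp(V, 0, 255));
                nv21[uvOffset + (y / 2) * width + x + 1] = static_cast<uint8_t>(std::clamp(U, 0, 255));
            }
        }
    }
    return nv21;
}
```

For example, a pure white pixel maps to Y = 235 with U = V = 128 (neutral chroma), which is the expected video-range result.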
Thx and greetings
We are working on a remote support project and we need to use the Input Plugin. As you know, it is necessary to broadcast one user's screen to another user, but the receiving side must use the first user's screen as its camera source (World).
I saw that documentation, but it is not very clear. Do we have to write a C++ plugin to feed the broadcast into Wikitude as the source? I need more details. Thank you.
Yes, you'd need to implement your own plugin using the Plugins API feature. But as mentioned, we don't have a specific sample or further code/documentation for the remote AR use case (just the sample in the sample app that shows how the Input Plugins API can be used). If it helps, we can connect you with one of our Premium Partners working on remote AR use cases who can assist you with the implementation of your app.
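To give you a rough idea of the shape such a plugin takes: your plugin receives decoded frames from your WebRTC/RTMP pipeline and hands them to the SDK as if they came from the device camera. The sketch below is structural only; `InputPluginBase` and `submitFrame` are hypothetical stand-ins for the real plugin base class and frame-submission call, whose actual names and signatures depend on the SDK version, so please check the Plugins API documentation of the release you use.

```cpp
#include <cstddef>
#include <cstdint>
#include <functional>
#include <vector>

// HYPOTHETICAL stand-in for the SDK's input plugin base class.
// The real Wikitude class/headers differ; this only illustrates the flow.
class InputPluginBase {
public:
    virtual ~InputPluginBase() = default;
    // Stand-in for the SDK call that hands a camera frame to the renderer.
    std::function<void(const uint8_t* data, std::size_t length, int width, int height)> submitFrame;
};

// Forwards decoded remote frames (e.g. NV21 buffers from your WebRTC
// pipeline) to the SDK, so the remote screen becomes the World's camera.
class RemoteCameraPlugin : public InputPluginBase {
public:
    RemoteCameraPlugin(int width, int height) : width_(width), height_(height) {}

    // Call this from your video pipeline for every decoded NV21 frame.
    void onRemoteFrame(const std::vector<uint8_t>& nv21) {
        if (submitFrame) {
            submitFrame(nv21.data(), nv21.size(), width_, height_);
        }
    }

private:
    int width_;
    int height_;
};
```

The key design point is simply decoupling: the video pipeline pushes frames in, and the plugin's only job is to hand them to the SDK's frame-input call on the expected thread and in the expected pixel format.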
Thx and greetings