I have an issue: the app needs to play different AR elements based on updated targets or updated offers. The feedback I'm getting from the team is that the app would need to be updated every time to play a new element. They are suggesting WebAR solutions, which are not available with Wikitude.
Can someone guide me on how to solve this problem with Wikitude?
You generally have different options here:
- Work with our Cloud Recognition feature. This is suitable if you have several thousand images you want to recognize and you want to e.g. embed the functionality in your own CMS for easier integration into your workflow when the content changes often.
- Work with Client Recognition and store the content on your servers. You can store the complete AR experience, or just individual files, on a server and reference it from the app. For example, when you store the complete AR experience on your servers, you can simply load it when the AR experience starts. Details on how to reference the experience in architectView.load() can be found in the corresponding documentation.
I hope this helps. Greetings
Thanks for getting back on this. Yes, the cloud service can enable detection of multiple images.
It's the second part of the feedback that is more likely a solution for us.
The problem is that for the new images hosted on our own servers, we want new animations (different messages, calls to action, etc.). What can be done so that the user does not have to update the app? Can the animations be downloaded and executed at runtime seamlessly?
Looking forward to your reply. I'm sure there must be a solution we are overlooking, as this issue must be quite common for CMS services, especially in retail.
As mentioned, you can store the complete AR experience (including the animations / augmentations) on your servers. I'm assuming you work with the JS SDK (or a JS SDK-based extension such as Cordova or Xamarin), so you'd put the .html, .js, .css files and assets on the server. If you then pass the URL of the index.html file to architectView.load(), the experience is loaded from there. So yes, they can be downloaded and executed at runtime.
Thx and greetings
Thanks for the answer. It points to what I was looking for. I will ask the team to look in this direction, and this should ideally help us. If you have any running example that can be shared, that would be great; otherwise I am sure we can figure something out from the info provided.
I have created the complete AR project in Unity using the Wikitude SDK. However, the physics and particle effects used in the project cannot be exported to the JS SDK. As I understand it, Unity effects only work with the native SDK (by exporting to iPhone or Android), and only the JS SDK can be run from remote servers.
Can you kindly help us work around this problem of exporting the effects with our project while still being able to use it remotely?
In general, you would work with either the JS SDK or the Unity plugin. If you work with Unity, you'd stay in the Unity world. If you wish to load e.g. the TargetCollection or Trackers at runtime while working with Unity, please have a look at our Runtime Trackers.
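To make the runtime-tracker idea concrete, here is a minimal sketch of creating an ImageTracker at runtime that points at a target collection hosted on your own server. The URL is a placeholder, and the exact property names (`TargetCollectionResource`, `UseCustomURL`, `TargetPath`) follow the Wikitude Unity samples but may differ between SDK versions, so please verify against the documentation for your version:

```csharp
using UnityEngine;
using Wikitude; // Wikitude Unity plugin namespace

public class RuntimeTrackerLoader : MonoBehaviour
{
    // Placeholder URL to a .wtc target collection hosted on your server.
    private const string TargetCollectionUrl = "https://example.com/targets/campaign.wtc";

    void Start()
    {
        // Create the tracker at runtime instead of configuring it in the editor.
        var trackerObject = new GameObject("ImageTracker");
        var tracker = trackerObject.AddComponent<ImageTracker>();

        // Point the tracker at the remote target collection; the SDK
        // downloads it when the tracker starts.
        tracker.TargetCollectionResource = new TargetCollectionResource
        {
            UseCustomURL = true,
            TargetPath = TargetCollectionUrl
        };
    }
}
```

This way, the set of recognizable images can change on the server without an app update.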
I hope this helps. Greetings
Thanks for getting back on this. Suppose we want to stay in the Unity world: can we still load the objects and animations at runtime? I've been slightly confused, thinking this is only possible with the JS SDK. Can you help clarify the misunderstanding?
If I understand the problem correctly, you would like to build the app once and then change the targets and the augmentations dynamically, without re-deploying the app. Is that correct?
To solve the first part, the most convenient way is to use Cloud Recognition. However, this does require that the app is always online to work. If that is not desired, there are other methods we can discuss.
To be able to change or add new augmentations, you'll need to use Asset Bundles. Please check the documentation, as it is quite an extensive topic, but the gist of it is that you would query your server to get the latest augmentations, download them if necessary and then configure the ImageTrackable in Unity to use prefabs from these Asset Bundles.
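A minimal sketch of the Asset Bundle flow described above, using standard Unity APIs (`UnityWebRequestAssetBundle`, `DownloadHandlerAssetBundle`). The bundle URL and the asset name "Augmentation" are placeholders; in practice you would first query your server for the latest bundle of the current campaign, and how you attach the prefab to the Wikitude ImageTrackable depends on your scene setup:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class AugmentationLoader : MonoBehaviour
{
    // Placeholder URL for the Asset Bundle built from your Unity project.
    private const string BundleUrl = "https://example.com/bundles/campaign-augmentations";

    IEnumerator Start()
    {
        // Download the Asset Bundle from your server at runtime.
        using (var request = UnityWebRequestAssetBundle.GetAssetBundle(BundleUrl))
        {
            yield return request.SendWebRequest();

            // On Unity 2020.2+ use request.result instead of these flags.
            if (request.isNetworkError || request.isHttpError)
            {
                Debug.LogError("Bundle download failed: " + request.error);
                yield break;
            }

            AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(request);

            // Load a prefab (with its animations attached) from the bundle.
            var prefab = bundle.LoadAsset<GameObject>("Augmentation");

            // Instantiate it under this object (e.g. the trackable) so it
            // appears when the image is recognized.
            Instantiate(prefab, transform);

            bundle.Unload(false);
        }
    }
}
```

Because the prefab inside the bundle can carry its own animations, particle systems, and scripts' serialized settings, new campaigns can be shipped as new bundles without an app update.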
Please let me know if you have any other questions.
Yes this is correct.
Our project is in Unity, with animations and effects to be used in AR. We want to use the animated objects in our app and download them from our servers at runtime (when online), without having to bundle them with the app. This way, different objects can be deployed without increasing the app size, and they can be changed from the backend server as per the offering.
Our issue is not the image tracking; that part is fine. Our project needs to be downloaded from the server and run from there. How can we export the project and use it remotely?
I hope this clears up the confusion.
We are still stuck in this situation with no solid resolution. We have built our project in Unity 3D with animated actions. Recognition of the trackable images is not an issue.
The issue is downloading the assets/animations at runtime from the server and executing them on the client, whereas the solution offered so far refers only to image-trackable assets.
If our Unity code cannot be downloaded and deployed, then we are at a dead end and need to look for another solution library.
Have you had a look at the Asset Bundles solution I mentioned previously? Asset Bundles allow you to download assets and animations from your server at runtime. Have you already tried them and run into issues?
Could we please set up a small call to discuss this in detail with the team present, and then write up the solution here? That would also let us present the problem properly, as it may not have been understood correctly here.
I just wrote you an email regarding the requested call, but wanted to give a quick answer here as well:
As mentioned, according to our support policy, and in order to be able to provide support to thousands of customers, we unfortunately can't accommodate your request for a call, as we need to plan our resources very carefully.
Thx for your understanding.
OK. Our issue is that we have an already-published app, in which we want to use the AR module for Unity games/animations and campaigns as part of the overall app.
We have exported from Unity to Android. We now want this exported module to be downloaded and executed at runtime, not built and shipped as part of the app, because more campaigns will increase the app size.
Just as in the JS API, where we call the index file from the server and it is loaded in the architect view, we want the complete exported module to run at runtime as well. The methodology mentioned so far covers only downloading the assets for a given project.
For any additional campaign, we would have to ship another app update if only the assets, and not the entire project, are loaded at runtime.