Our app has AR as an additional feature, so we support both devices that can use AR and devices that can't. If a user with an unsupported device enters the AR scene, we want to show them that their device doesn't support the AR feature. Is there a way to achieve this?
Could you tell me what classifies a device that can't use AR (for your purposes)?
The SDK raises errors and fires events if, for instance, the device doesn't support ARKit/ARCore, camera permission was denied, or there is no camera at all to retrieve data from.
If performance (in terms of frames per second) is a classifier for you, there is unfortunately no way to tell beforehand whether a device is supported, because this heavily depends on how many trackers and targets there are in a scene and what kind of tracker is used (image, cylinder, object, ...).
Multiple object tracking, for instance, is very memory-heavy. If you use it, you could measure the memory allocation on a few devices; that would give you a decent indicator of how much memory your AR scene needs, which you could then use to estimate whether a given device can run the scene.
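If it helps, a rough way to measure that in Unity is to sample the Profiler's allocation counter before and after the AR scene spins up. This is just a sketch; the component and field names here are placeholders, not anything from the Wikitude SDK:

```csharp
using UnityEngine;
using UnityEngine.Profiling;

// Rough memory probe for an AR scene. Attach to any GameObject in the scene
// you want to measure; names like "MemoryProbe" are illustrative only.
public class MemoryProbe : MonoBehaviour
{
    long _baseline;

    void Start()
    {
        // Memory Unity has allocated before the trackers start up.
        _baseline = Profiler.GetTotalAllocatedMemoryLong();
    }

    void Update()
    {
        long current = Profiler.GetTotalAllocatedMemoryLong();
        long deltaMB = (current - _baseline) / (1024 * 1024);
        Debug.Log($"AR scene memory delta: {deltaMB} MB " +
                  $"(device RAM: {SystemInfo.systemMemorySize} MB)");
    }
}
```

Comparing that delta against `SystemInfo.systemMemorySize` on a handful of low-end test devices would give you the kind of ballpark threshold described above.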
But to summarize: we have a number of events and error callbacks that fire if something the SDK needs from a device isn't available.
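In Unity, wiring that up could look roughly like the sketch below. The exact event and error type names depend on your Wikitude SDK version, so treat `OnTrackerError` and the commented-out `Wikitude.Error` parameter as placeholders and check the SDK's Unity samples for the real signatures:

```csharp
using UnityEngine;
// using Wikitude;  // uncomment in a project with the Wikitude SDK imported

// Illustrative only — hook this method up to the SDK's error event
// (e.g. via the Inspector on the camera or tracker component).
public class ArSupportGate : MonoBehaviour
{
    [SerializeField] GameObject unsupportedOverlay;  // "device not supported" UI

    public void OnTrackerError(/* Wikitude.Error error */)
    {
        // Any startup error (no ARKit/ARCore support, no camera,
        // permission denied) means this device can't run the AR scene,
        // so show the fallback UI instead.
        unsupportedOverlay.SetActive(true);
    }
}
```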
Thank you, I'll try using the error callback.
Do you mean support for ARKit/ARCore or AR functionality in general (e.g. image tracking, etc.)?
Are you using Unity?
Yes, I'm using Unity. I meant checking for support of Wikitude's features (image tracking and object tracking).