
Hands-free input control and Wikitude SDK

Hi guys,

I came across this forum through the Wikitude SDK, which works with several smart glasses. We are currently developing software that enables hands-free interaction with Augmented and Virtual Reality devices using their built-in camera.

We are looking for games and applications that might benefit from hands-free input control. Furthermore, we want to learn more about the requirements of hands-free input control from a developer's perspective.

Please have a look at the following video: https://youtu.be/unKJrff5lTA (the technology in action starts at 0:35).

From your point of view, what are the crucial requirements for hands-free input control?

I am looking forward to your feedback. Thanks a lot.

Best,

Desiree

Hello Desiree,

There are a number of possible ways to interact hands-free with the AR scene:

* Use voice commands 
* Gesture detection (as in your sample)
* Trigger actions with a virtual "auto-selection crosshair" in your field of view (sketched below)

to name a few.
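
To make the crosshair option a bit more concrete, here is a minimal, SDK-independent sketch of the dwell-timer logic usually behind it. All types, names and thresholds are illustrative assumptions, not part of any SDK:

```cpp
// Minimal sketch of "auto-selection crosshair" (dwell) selection logic.
#include <chrono>
#include <optional>

struct Rect  { float x, y, w, h; };  // screen-space bounding box of a target
struct Point { float x, y; };        // crosshair position (usually screen centre)

class DwellSelector {
public:
    explicit DwellSelector(std::chrono::milliseconds dwellTime)
        : dwellTime_(dwellTime) {}

    // Call once per rendered frame. Returns true exactly when the crosshair
    // has rested on the target long enough to trigger a "click".
    bool update(const Point& crosshair, const std::optional<Rect>& target) {
        const auto now = std::chrono::steady_clock::now();
        const bool hit = target && contains(*target, crosshair);

        if (!hit) {                 // crosshair left the target: re-arm
            dwellStart_.reset();
            fired_ = false;
            return false;
        }
        if (!dwellStart_) {         // crosshair just entered the target
            dwellStart_ = now;
            return false;
        }
        if (!fired_ && now - *dwellStart_ >= dwellTime_) {
            fired_ = true;          // trigger once per dwell; re-arms on exit
            return true;
        }
        return false;
    }

private:
    static bool contains(const Rect& r, const Point& p) {
        return p.x >= r.x && p.x <= r.x + r.w && p.y >= r.y && p.y <= r.y + r.h;
    }

    std::chrono::milliseconds dwellTime_;
    std::optional<std::chrono::steady_clock::time_point> dwellStart_;
    bool fired_ = false;
};
```

You would call `update()` once per frame with the crosshair position and the screen-space box of whatever object it currently points at.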

Requirements for hands-free interaction depend heavily on the use case. For example, when you are repairing something with AR assistance, voice commands suit better than gestures; but when your hands are free anyway, you may use smart wearables (a smart ring, Myo, ...) to keep interaction complexity low and to avoid having to make gestures inside the field of view of your camera.

Note: You will soon be able to combine your technology with the Wikitude SDK, so developers can use Wikitude's SDK features and add gesture recognition on top. The upcoming Wikitude SDK 5.0 comes with a plugin concept, so you can access the camera frames and even draw custom objects/information on top of them.
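
As an illustration of how such a plugin could carry your gesture recognition, here is a rough sketch. The base class, callback name and frame type below are assumptions based on the announced plugin concept, not the final 5.0 API:

```cpp
// Illustrative sketch only: the plugin base class and per-frame callback are
// assumed, and GestureRecognizer is a placeholder for your own detection code.
#include <cstdint>

struct CameraFrame {               // placeholder for the frame type the SDK hands over
    const std::uint8_t* luminance; // e.g. the Y plane of a YUV frame
    int width;
    int height;
};

class GestureRecognizer {          // placeholder for your own detection code
public:
    bool detectSwipe(const std::uint8_t* /*y*/, int /*width*/, int /*height*/) {
        return false;              // real detection logic goes here
    }
};

class GesturePlugin /* : public wikitude::sdk::Plugin -- assumed base class */ {
public:
    // Assumed per-frame hook: called by the SDK for every camera frame.
    void cameraFrameAvailable(const CameraFrame& frame) {
        if (recognizer_.detectSwipe(frame.luminance, frame.width, frame.height)) {
            // Forward the gesture event to the AR scene, e.g. via the JS API.
        }
    }

private:
    GestureRecognizer recognizer_;
};
```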
 

Best regards,
Andreas

Hello Andreas,

 

Thanks for your input. Does the upcoming version support access to the camera frame as an OpenGL ES texture? And what camera frame rate does Wikitude use?
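
To make the first question a bit more concrete, this is roughly what we would like to do with each frame. The calls below are plain OpenGL ES 2.0; none of the names are meant to be Wikitude API:

```cpp
// Sketch: upload the luminance (Y) plane of a camera frame into a GL ES 2.0 texture.
#include <GLES2/gl2.h>
#include <cstdint>

GLuint createCameraTexture() {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    return tex;
}

void uploadLuminance(GLuint tex, const std::uint8_t* yPlane, int width, int height) {
    glBindTexture(GL_TEXTURE_2D, tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  // rows are tightly packed single bytes
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, yPlane);
}
```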

 

Best,

Desiree