In my current project, I place multiple Models and ImageDrawables in the 'cam' of an ImageTrackable. Users can drag the Models over the ImageTrackable, just as explained in your "Gestures" documentation.
This works quite well. However, I have trouble tracking the actual position of the finger on the ImageTrackable. Since xNormalized and yNormalized are relative, I need to translate them to a position on the ImageTrackable, and my current translation leads to unintuitive behaviour: the Model travels too quickly or too slowly relative to the finger.
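For reference, here is a minimal sketch of the conversion I am attempting. None of these names come from the Wikitude API; they are my own placeholders, and only the math is the point: scale the normalized screen delta up to pixels, then rescale by the trackable's current on-screen size.

```javascript
// Convert a drag delta given in screen-normalized coordinates (a fraction
// of the screen dimension) into units local to the tracked image.
// All parameter names are hypothetical, not Wikitude API:
//   deltaNormalized : drag distance as a fraction of the screen dimension
//   screenSizePx    : screen width (or height) in pixels
//   trackableSizePx : the tracked image's current on-screen size in pixels
//   trackableUnits  : the trackable's size in model/scene units
function normalizedToTrackable(deltaNormalized, screenSizePx, trackableSizePx, trackableUnits) {
    var deltaPx = deltaNormalized * screenSizePx;          // pixels moved on screen
    return (deltaPx / trackableSizePx) * trackableUnits;   // same distance in trackable units
}
```

For example, a drag of 0.5 on a 1080 px wide screen, over a trackable that currently covers 540 px and is 1 unit wide, should move the Model exactly 1 unit. Is this roughly the intended formula, or does the SDK expose the trackable's on-screen size some other way?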
I would like a 'mouseOver' event on the ImageDrawables that is triggered when the finger passes over them while dragging a Model. That way I could create a 'drop' location and drop the Model at the position of that ImageDrawable when the Model's onDragEnded fires.
This isn't possible at the moment, is it? Or is there a simple formula to translate xNormalized and yNormalized exactly to a position on the ImageTrackable?
Jasper Soetendal