I am trying to use the touch screen on the tablet to control my camera movements and generate the view matrix accordingly.

I get the x and y coordinates of the touch and, after some sanity checks, use them directly. My problem: the object does move along with the touch, but it is always at an offset from the touch/pointer position, and I can't understand why. (I'm fairly new at this.)

Here is the camera logic, where pX and pY are the touch coordinates on the device screen:

pitch = pY * 0.05f;
yaw   = pX * 0.05f;

LOGD(">>pitch %f", pitch);

pitch = min(pitch, 90.0f);
pitch = max(pitch, -90.0f);

LOGD(">>pitch(clamped) %f", pitch);

if (yaw < 0.0f) {
    yaw += 360.0f;
}
if (yaw > 360.0f) {
    yaw -= 360.0f;
}
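One likely source of the offset is that the code maps *absolute* screen coordinates straight to angles, so the camera orientation is tied to where on the screen the finger happens to be rather than to how far it has moved. A minimal sketch of the alternative, accumulating per-event deltas and applying the same clamp/wrap logic as above (the `Camera` struct and `onDrag` names are illustrative, not from the original code):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Minimal sketch: accumulate touch-move deltas into pitch/yaw,
// rather than mapping absolute pX/pY to angles directly.
struct Camera {
    float pitch = 0.0f;        // degrees, clamped to [-90, 90]
    float yaw   = 0.0f;        // degrees, wrapped to [0, 360)
    float sensitivity = 0.05f; // degrees per pixel of drag

    // dx/dy: change in touch position since the last event, in pixels.
    void onDrag(float dx, float dy) {
        yaw   += dx * sensitivity;
        pitch += dy * sensitivity;

        // Clamp pitch so the camera cannot flip over the poles.
        pitch = std::max(-90.0f, std::min(90.0f, pitch));

        // Wrap yaw into [0, 360) regardless of how far it drifted.
        yaw = std::fmod(yaw, 360.0f);
        if (yaw < 0.0f) {
            yaw += 360.0f;
        }
    }
};
```

On each `ACTION_MOVE` you would pass `currentX - lastX` and `currentY - lastY`, then remember the current position for the next event; on `ACTION_DOWN` you only record the position without rotating.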

I am not sure whether I am going about this the right way. Any suggestions would be appreciated.

  • What is "the object" — the object you are trying to rotate, or the given coordinates? Commented Feb 19, 2013 at 10:42
  • It's the object in the world. Right now I have a triangle. Commented Feb 19, 2013 at 12:02
  • Maybe the coordinates you compare are not in the same space: the swipe coordinates are in screen/camera space, but the object coordinates are in world space. I don't really know how you would transform them into the same space, though. Commented Feb 19, 2013 at 15:59
