This isn't a question specifically about jPCT-AE; it's more about GLSurfaceView in general.
I have a 5x5 grid of tiles. If I touch one of the tiles on screen the camera re-adjusts so that it is centered on that tile. I am trying to animate this process so that it is a smooth transition rather than instant.
To do this I've set up two sets of co-ordinates:
1) The destination x, y, and z co-ordinates.
2) The current camera co-ordinates.
I'm only looking at the x axis for now. In the onDrawFrame() method I do a check: if the current camera X position is not (roughly) equal to the destination X position, I increment/decrement the current camera X co-ordinate by some small amount (0.1 units, for example) to bring it closer to the destination.
My thinking is that it will keep incrementing or decrementing until it equals the destination X position, at which point we've reached the destination and no more animation is required.
My problem is:
- If I set the increment to something small (cameraX = cameraX + 0.001f), the animation is smooth but extremely SLOW.
- If I set the increment to something bigger (cameraX = cameraX + 1f), the animation is fast enough but extremely CHUNKY.
Is my approach valid? How do I implement smooth camera movement animation?
Any help would be much appreciated.
FYI, my animation code is below. The modifier is either +1 or -1, depending on whether the destination is to the left or the right of the current position.
public void onDrawFrame(GL10 gl) {
    // Animate only while the camera is outside a 0.1-unit tolerance of the destination
    if (cameraX < destinationX - 0.1f || cameraX > destinationX + 0.1f) {
        // Step towards the destination: +1 if it's to the right, -1 if to the left
        float xModifier = 1f;
        if (cameraX > destinationX) {
            xModifier = -1f;
        }
        cameraX = cameraX + (1f * xModifier);
    }
    camera.setPosition(cameraX, cameraY, cameraZ - 10);
    ...
In the emulator (AVD) I'm getting only 9 FPS on average. I'm on a Mac laptop (I'm away on holiday), so I can't test on a real device. Not sure if the low frame rate is related to the chunkiness.
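For what it's worth, one variant I've been wondering about is scaling the step by the time elapsed since the last frame instead of using a fixed per-frame amount, so the speed doesn't depend on FPS. Here's a rough, self-contained sketch of that idea (the SPEED and TOLERANCE values are just guesses, not from my actual project):

```java
// Sketch of frame-rate-independent camera stepping along the x axis.
public class CameraStepper {
    static final float SPEED = 5f;        // units per second (assumed value)
    static final float TOLERANCE = 0.05f; // snap distance (assumed value)

    float cameraX;

    CameraStepper(float startX) {
        cameraX = startX;
    }

    // dtSeconds: time elapsed since the previous frame
    void step(float destinationX, float dtSeconds) {
        float remaining = destinationX - cameraX;
        if (Math.abs(remaining) <= TOLERANCE) {
            cameraX = destinationX; // close enough: snap to the destination
            return;
        }
        // Move at most SPEED * dt this frame, never overshooting the target
        float maxStep = SPEED * dtSeconds;
        cameraX += Math.signum(remaining) * Math.min(Math.abs(remaining), maxStep);
    }

    public static void main(String[] args) {
        CameraStepper s = new CameraStepper(0f);
        // Simulate ~9 FPS; the camera still covers 5 units at 5 units/second
        for (int i = 0; i < 20; i++) {
            s.step(5f, 1f / 9f);
        }
        System.out.println(s.cameraX);
    }
}
```

The idea is that whether the renderer runs at 9 FPS or 60 FPS, the camera covers the same distance per second; only the smoothness of the individual steps changes. In onDrawFrame() the dt value would come from measuring the time between frames.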