Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - nimo

1
Support / Re: using sensor to rotate camera
« on: September 28, 2010, 12:10:36 am »
Hi,
here is the complete code described in my post (I just omitted the package declaration and the imports).
It's a merge between the JPCT Demo and the code found in the article I mentioned.
Because of the merge, you'll find a lot of unused code. When you clean it up, keep in mind that only MODUS = 1, 2, 3 and 4 work (MODUS = 2 is the preferred one). Remember also to set the screen orientation to landscape, as described in my post.

How to use the code:
1 - Download the JPCT Demo and open it in Eclipse (make sure it works before continuing)
2 - Replace Demo.java with the code below
3 - Set the screen orientation to landscape in the manifest file
4 - Run it on your device
5 - ...move your phone around and enjoy  :o

Code:
/**
 * A simple demo. This shows more how to use jPCT-AE than it shows how to write
 * a proper application for Android, because I have no idea how to do this. This
 * thing is more or less a hack to get you started...
 *
 * @author EgonOlsen
 *
 * Modified in order to move the camera according to the sensors
 *
 */
public class Demo extends Activity implements SensorEventListener {

private static final boolean TRY_TRANSPOSED_VERSION = false;

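// MODUS selects how the sensor data is turned into a camera rotation; as noted
// in the post above, only MODUS = 1, 2, 3 and 4 work in this merge (2 is the
// preferred one).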
private static int MODUS = 2;

private GLSurfaceView mGLView;
private MyRenderer renderer = null;
private FrameBuffer fb = null;
private World world = null;
private boolean paused = false;

private SensorManager mSensorManager;
private float[] rotationMatrix = new float[9];
private float[] accelGData = new float[3];
private float[] bufferedAccelGData = new float[3];
private float[] magnetData = new float[3];
private float[] bufferedMagnetData = new float[3];
private float[] orientationData = new float[3];
private float[] resultingAngles = new float[3];

private int mCount;

final static float rad2deg = (float) (180.0f / Math.PI);

private boolean landscape;

protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);

mGLView = new GLSurfaceView(getApplication());

mGLView.setEGLConfigChooser(new GLSurfaceView.EGLConfigChooser() {
public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display) {
// Ensure that we get a 16bit framebuffer. Otherwise, we'll fall
// back to Pixelflinger on some devices (read: Samsung I7500)
int[] attributes = new int[] { EGL10.EGL_DEPTH_SIZE, 16,
EGL10.EGL_NONE };
EGLConfig[] configs = new EGLConfig[1];
int[] result = new int[1];
egl.eglChooseConfig(display, attributes, configs, 1, result);
return configs[0];
}
});

renderer = new MyRenderer();
mGLView.setRenderer(renderer);
setContentView(mGLView);
}

@Override
protected void onPause() {
paused = true;
super.onPause();
mGLView.onPause();
}

@Override
protected void onResume() {
paused = false;
super.onResume();
mGLView.onResume();

if (((WindowManager) getSystemService(WINDOW_SERVICE))
.getDefaultDisplay().getOrientation() == 1) {
landscape = true;
} else {
landscape = false;
}

mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
SensorManager.SENSOR_DELAY_GAME);
mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
SensorManager.SENSOR_DELAY_GAME);
mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION),
SensorManager.SENSOR_DELAY_GAME);
}

protected void onStop() {
renderer.stop();
super.onStop();
}

protected boolean isFullscreenOpaque() {
return true;
}

class MyRenderer implements GLSurfaceView.Renderer {

private Object3D plane = null;
private Object3D tree2 = null;
private Object3D tree1 = null;
private Object3D grass = null;
private Texture font = null;

private int fps = 0;
private int lfps = 0;

private long time = System.currentTimeMillis();

private Light sun = null;
private Object3D rock = null;

private boolean stop = false;

private float ind;

private boolean deSer = true;

public MyRenderer() {
Config.maxPolysVisible = 5000;
Config.farPlane = 1500;
}

public void stop() {
stop = true;
if (fb != null) {
fb.dispose();
fb = null;
}
}

public void onSurfaceChanged(GL10 gl, int w, int h) {
if (fb != null) {
fb.dispose();
}
fb = new FrameBuffer(gl, w, h);
}

public void onSurfaceCreated(GL10 gl, EGLConfig config) {
TextureManager.getInstance().flush();
world = new World();
Resources res = getResources();

TextureManager tm = TextureManager.getInstance();
Texture grass2 = new Texture(res.openRawResource(R.raw.grassy));
Texture leaves = new Texture(res.openRawResource(R.raw.tree2y));
Texture leaves2 = new Texture(res.openRawResource(R.raw.tree3y));
Texture rocky = new Texture(res.openRawResource(R.raw.rocky));

Texture planetex = new Texture(res.openRawResource(R.raw.planetex));

font = new Texture(res.openRawResource(R.raw.numbers));

tm.addTexture("grass2", grass2);
tm.addTexture("leaves", leaves);
tm.addTexture("leaves2", leaves2);
tm.addTexture("rock", rocky);
tm.addTexture("grassy", planetex);

if (!deSer) {
// Use the normal loaders...
plane = Primitives.getPlane(20, 30);
grass = Loader.load3DS(res.openRawResource(R.raw.grass), 5)[0];
rock = Loader.load3DS(res.openRawResource(R.raw.rock), 15f)[0];
tree1 = Loader.load3DS(res.openRawResource(R.raw.tree2), 5)[0];
tree2 = Loader.load3DS(res.openRawResource(R.raw.tree3), 5)[0];

plane.setTexture("grassy");
rock.setTexture("rock");
grass.setTexture("grass2");
tree1.setTexture("leaves");
tree2.setTexture("leaves2");

plane.getMesh().setVertexController(new Mod(), false);
plane.getMesh().applyVertexController();
plane.getMesh().removeVertexController();
} else {
// Load the serialized version instead...
plane = Loader.loadSerializedObject(res
.openRawResource(R.raw.serplane));
rock = Loader.loadSerializedObject(res
.openRawResource(R.raw.serrock));
tree1 = Loader.loadSerializedObject(res
.openRawResource(R.raw.sertree1));
tree2 = Loader.loadSerializedObject(res
.openRawResource(R.raw.sertree2));
grass = Loader.loadSerializedObject(res
.openRawResource(R.raw.sergrass));
}

grass.translate(-45, -17, -50);
grass.rotateZ((float) Math.PI);
rock.translate(0, 0, -90);
rock.rotateX(-(float) Math.PI / 2);
tree1.translate(-50, -92, -50);
tree1.rotateZ((float) Math.PI);
tree2.translate(60, -95, 10);
tree2.rotateZ((float) Math.PI);
plane.rotateX((float) Math.PI / 2f);

plane.setName("plane");
tree1.setName("tree1");
tree2.setName("tree2");
grass.setName("grass");
rock.setName("rock");

world.addObject(plane);
world.addObject(tree1);
world.addObject(tree2);
world.addObject(grass);
world.addObject(rock);

RGBColor dark = new RGBColor(100, 100, 100);

grass.setTransparency(10);
tree1.setTransparency(0);
tree2.setTransparency(0);

tree1.setAdditionalColor(dark);
tree2.setAdditionalColor(dark);
grass.setAdditionalColor(dark);

world.setAmbientLight(200, 200, 200);
world.buildAllObjects();

sun = new Light(world);

Camera cam = world.getCamera();
cam.moveCamera(Camera.CAMERA_MOVEOUT, 250);
cam.moveCamera(Camera.CAMERA_MOVEUP, 100);
cam.lookAt(plane.getTransformedCenter());

cam.setFOV(1.5f);
sun.setIntensity(250, 250, 250);

SimpleVector sv = new SimpleVector();
sv.set(plane.getTransformedCenter());
sv.y -= 300;
sv.x -= 100;
sv.z += 200;
sun.setPosition(sv);
}

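// Copies the 3x3 row-major rotation matrix delivered by SensorManager into a
// 4x4 jPCT matrix; the fourth row and column are filled with the identity
// values. The 'traspose' flag writes the transposed matrix instead.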
private void copyMatrix(float[] src, com.threed.jpct.Matrix dest,
boolean traspose) {
if (!traspose) {
dest.setRow(0, src[0], src[1], src[2], 0);   
dest.setRow(1, src[3], src[4], src[5], 0);   
dest.setRow(2, src[6], src[7], src[8], 0);   
dest.setRow(3, 0f, 0f, 0f, 1f);
} else {
dest.setRow(0, src[0], src[3], src[6], 0);
dest.setRow(1, src[1], src[4], src[7], 0);
dest.setRow(2, src[2], src[5], src[8], 0);
dest.setRow(3, 0f, 0f, 0f, 1f);

}
}

public void onDrawFrame(GL10 gl) {

try {
if (!stop) {
if (paused) {
Thread.sleep(500);
} else {
Camera cam = world.getCamera();

if ((MODUS == 1) || (MODUS == 2) || (MODUS == 3)
|| (MODUS == 4)) {
boolean traspose = false;
if (landscape) {
// in landscape mode first remap the
// rotationMatrix before using
// it with cam.setBack():
float[] result = new float[9];
SensorManager.remapCoordinateSystem(
rotationMatrix, SensorManager.AXIS_MINUS_Y,
SensorManager.AXIS_MINUS_X, result);
com.threed.jpct.Matrix mResult = new com.threed.jpct.Matrix();
copyMatrix(result, mResult, traspose);
//mResult.rotateZ((float)Math.PI / 2);
cam.setBack(mResult);
// gl.glMultMatrixf(result, 0);
} else {
com.threed.jpct.Matrix mResult = new com.threed.jpct.Matrix();
copyMatrix(rotationMatrix, mResult, traspose);
cam.setBack(mResult);
}
} else {
// in all other modes do the rotation by hand
// the order y x z is important!
// gl.glRotatef(resultingAngles[2], 0, 1, 0);
// gl.glRotatef(resultingAngles[1], 1, 0, 0);
// gl.glRotatef(resultingAngles[0], 0, 0, 1);
cam.getBack().setIdentity();
cam.rotateCameraX((float) Math.PI / 2);

cam.rotateCameraY(resultingAngles[0]);
cam.rotateCameraX(resultingAngles[2]);
cam.rotateCameraZ(resultingAngles[1]);
}

// move the axis to simulate augmented behaviour:
// gl.glTranslatef(0, 2, 0);

fb.clear();
world.renderScene(fb);
world.draw(fb);
blitNumber(lfps, 5, 5);

fb.display();

sun.rotate(new SimpleVector(0, 0.05f, 0),
plane.getTransformedCenter());

if (System.currentTimeMillis() - time >= 1000) {
lfps = (fps + lfps) >> 1;
fps = 0;
time = System.currentTimeMillis();
}
fps++;
ind += 0.02f;
if (ind > 1) {
ind -= 1;
}
}
} else {
if (fb != null) {
fb.dispose();
fb = null;
}
}
} catch (Exception e) {
Logger.log("Drawing thread terminated!", Logger.MESSAGE);
}
}

private class Mod extends GenericVertexController {
private static final long serialVersionUID = 1L;

public void apply() {
SimpleVector[] s = getSourceMesh();
SimpleVector[] d = getDestinationMesh();
for (int i = 0; i < s.length; i++) {
d[i].z = s[i].z
- (10f * (FloatMath.sin(s[i].x / 50f) + FloatMath
.cos(s[i].y / 50f)));
d[i].x = s[i].x;
d[i].y = s[i].y;
}
}
}

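// Blits an integer (here the FPS counter) onto the framebuffer using the
// 'numbers' font texture; each digit is 5x9 pixels, laid out left to right.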
private void blitNumber(int number, int x, int y) {
if (font != null) {
String sNum = Integer.toString(number);
for (int i = 0; i < sNum.length(); i++) {
char cNum = sNum.charAt(i);
int iNum = cNum - 48;
fb.blit(font, iNum * 5, 0, x, y, 5, 9,
FrameBuffer.TRANSPARENT_BLITTING);
x += 5;
}
}
}
}

public void onAccuracyChanged(Sensor sensor, int accuracy) {
}

public void onSensorChanged(SensorEvent event) {

// load the new values:
loadNewSensorData(event);

if (MODUS == 1) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
}

if (MODUS == 2) {
rootMeanSquareBuffer(bufferedAccelGData, accelGData);
rootMeanSquareBuffer(bufferedMagnetData, magnetData);
SensorManager.getRotationMatrix(rotationMatrix, null,
bufferedAccelGData, bufferedMagnetData);
}

if (MODUS == 3) {
rootMeanSquareBuffer(bufferedMagnetData, magnetData);
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
bufferedMagnetData);
}

if (MODUS == 4) {
rootMeanSquareBuffer(bufferedAccelGData, accelGData);
SensorManager.getRotationMatrix(rotationMatrix, null,
bufferedAccelGData, magnetData);
}

if (MODUS == 5) {
// this mode uses the sensor data received from the orientation
// sensor
resultingAngles = orientationData.clone();
if ((-90 > resultingAngles[1]) || (resultingAngles[1] > 90)) {
resultingAngles[1] = orientationData[0];
resultingAngles[2] = orientationData[1];
resultingAngles[0] = orientationData[2];
}
}

if (MODUS == 6) {
rootMeanSquareBuffer(bufferedAccelGData, accelGData);
rootMeanSquareBuffer(bufferedMagnetData, magnetData);
SensorManager.getRotationMatrix(rotationMatrix, null,
bufferedAccelGData, bufferedMagnetData);
final float[] anglesInRadians = new float[3];
SensorManager.getOrientation(rotationMatrix, anglesInRadians);
// TODO check for landscape mode
resultingAngles[0] = anglesInRadians[0]; // * rad2deg;
resultingAngles[1] = anglesInRadians[1]; // * rad2deg;
resultingAngles[2] = anglesInRadians[2]; // * -rad2deg;
}

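// Note: the angle extraction in MODUS 7-12 below appears to assume a
// 16-element (4x4) rotation matrix; with the 9-element rotationMatrix declared
// above, indices such as rotationMatrix[10] go out of bounds, so these modes
// are not expected to work here (see the note at the top: only MODUS 1-4 work).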
if (MODUS == 7) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);

rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in x y z
* order Rx*Ry*Rz
*/

resultingAngles[2] = (float) (Math.asin(rotationMatrix[2]));
final float cosB = (float) Math.cos(resultingAngles[2]);
resultingAngles[2] = resultingAngles[2] * rad2deg;
resultingAngles[0] = -(float) (Math.acos(rotationMatrix[0] / cosB))
* rad2deg;
resultingAngles[1] = (float) (Math.acos(rotationMatrix[10] / cosB))
* rad2deg;
}

if (MODUS == 8) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in z y x
*/

resultingAngles[2] = (float) (Math.asin(-rotationMatrix[8]));
final float cosB = (float) Math.cos(resultingAngles[2]);
resultingAngles[2] = resultingAngles[2] * rad2deg;
resultingAngles[1] = (float) (Math.acos(rotationMatrix[9] / cosB))
* rad2deg;
resultingAngles[0] = (float) (Math.asin(rotationMatrix[4] / cosB))
* rad2deg;
}

if (MODUS == 9) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in z x y
*
* note z axis looks good at this one
*/

resultingAngles[1] = (float) (Math.asin(rotationMatrix[9]));
final float minusCosA = -(float) Math.cos(resultingAngles[1]);
resultingAngles[1] = resultingAngles[1] * rad2deg;
resultingAngles[2] = (float) (Math.asin(rotationMatrix[8]
/ minusCosA))
* rad2deg;
resultingAngles[0] = (float) (Math.asin(rotationMatrix[1]
/ minusCosA))
* rad2deg;
}

if (MODUS == 10) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in y x z
*/

resultingAngles[1] = (float) (Math.asin(-rotationMatrix[6]));
final float cosA = (float) Math.cos(resultingAngles[1]);
resultingAngles[1] = resultingAngles[1] * rad2deg;
resultingAngles[2] = (float) (Math.asin(rotationMatrix[2] / cosA))
* rad2deg;
resultingAngles[0] = (float) (Math.acos(rotationMatrix[5] / cosA))
* rad2deg;
}

if (MODUS == 11) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in y z x
*/

resultingAngles[0] = (float) (Math.asin(rotationMatrix[4]));
final float cosC = (float) Math.cos(resultingAngles[0]);
resultingAngles[0] = resultingAngles[0] * rad2deg;
resultingAngles[2] = (float) (Math.acos(rotationMatrix[0] / cosC))
* rad2deg;
resultingAngles[1] = (float) (Math.acos(rotationMatrix[5] / cosC))
* rad2deg;
}

if (MODUS == 12) {
SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
magnetData);
rotationMatrix = transpose(rotationMatrix);
/*
* this assumes that the rotation matrices are multiplied in x z y
*/

resultingAngles[0] = (float) (Math.asin(-rotationMatrix[1]));
final float cosC = (float) Math.cos(resultingAngles[0]);
resultingAngles[0] = resultingAngles[0] * rad2deg;
resultingAngles[2] = (float) (Math.acos(rotationMatrix[0] / cosC))
* rad2deg;
resultingAngles[1] = (float) (Math.acos(rotationMatrix[5] / cosC))
* rad2deg;
}
logOutput();
}

/**
* transposes the matrix; for a rotation matrix the transpose equals the
* inverse, and the transposed version is the one meant to be used with OpenGL
* (only applied when TRY_TRANSPOSED_VERSION is true, otherwise the matrix is
* returned unchanged)
*
* @param source
* @return
*/
private float[] transpose(float[] source) {
final float[] result = source.clone();
if (TRY_TRANSPOSED_VERSION) {
result[1] = source[4];
result[2] = source[8];
result[4] = source[1];
result[6] = source[9];
result[8] = source[2];
result[9] = source[6];
}
// the other values in the matrix are not relevant for rotations
return result;
}

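// Weighted root-mean-square smoothing of the sensor values: the buffered value
// ('target') is combined with the new reading, with weight 'buffer' on the old
// value. The temporary +amplification offset shifts the data into the positive
// range so that squaring does not lose the sign.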
private void rootMeanSquareBuffer(float[] target, float[] values) {

final float amplification = 200.0f;
float buffer = 20.0f;

target[0] += amplification;
target[1] += amplification;
target[2] += amplification;
values[0] += amplification;
values[1] += amplification;
values[2] += amplification;

target[0] = (float) (Math
.sqrt((target[0] * target[0] * buffer + values[0] * values[0])
/ (1 + buffer)));
target[1] = (float) (Math
.sqrt((target[1] * target[1] * buffer + values[1] * values[1])
/ (1 + buffer)));
target[2] = (float) (Math
.sqrt((target[2] * target[2] * buffer + values[2] * values[2])
/ (1 + buffer)));

target[0] -= amplification;
target[1] -= amplification;
target[2] -= amplification;
values[0] -= amplification;
values[1] -= amplification;
values[2] -= amplification;
}

private void loadNewSensorData(SensorEvent event) {
final int type = event.sensor.getType();
if (type == Sensor.TYPE_ACCELEROMETER) {
accelGData = event.values.clone();
}
if (type == Sensor.TYPE_MAGNETIC_FIELD) {
magnetData = event.values.clone();
}
if (type == Sensor.TYPE_ORIENTATION) {
orientationData = event.values.clone();
}
}

private void logOutput() {
if (mCount++ > 30) {
mCount = 0;
Log.d("Compass", "yaw0: " + (int) (resultingAngles[0])
+ "  pitch1: " + (int) (resultingAngles[1]) + "  roll2: "
+ (int) (resultingAngles[2]));
}
}

}

2
Hello,
after a lot of trial and error, I was able to use the orientation sensors to make the jPCT camera follow the device movements.
This technique uses the Camera.setBack() method, and it seems to work very well.
The solution comes from this interesting article: http://stackoverflow.com/questions/2881128/how-to-use-onsensorchanged-sensor-data-in-combination-with-opengl

The article is about using the sensors directly with the OpenGL API, so I had to adapt it to make it work with jPCT.

First of all, you must get the sensors' values:

Code:
public void onSensorChanged(SensorEvent event) {
final int type = event.sensor.getType();
if (type == Sensor.TYPE_ACCELEROMETER) {
accelGData = event.values.clone();
}
if (type == Sensor.TYPE_MAGNETIC_FIELD) {
magnetData = event.values.clone();
}
if (type == Sensor.TYPE_ORIENTATION) {
orientationData = event.values.clone();
}
rootMeanSquareBuffer(bufferedAccelGData, accelGData);
rootMeanSquareBuffer(bufferedMagnetData, magnetData);
SensorManager.getRotationMatrix(rotationMatrix, null,
bufferedAccelGData, bufferedMagnetData);
}


I omitted the code needed to register the SensorManager listeners, because there is nothing special about it compared to the standard approach (a minimal sketch is shown right below).
Note the use of the rootMeanSquareBuffer() function to smooth the sensor data.
It is essential to obtain a stable camera movement.
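
For reference, the registration is the standard SensorManager pattern, as in the full listing of the previous post. A minimal sketch, assuming your Activity implements SensorEventListener (you may also want to unregister the listeners in onPause() to save battery):

Code:
@Override
protected void onResume() {
    super.onResume();
    SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    // accelerometer + magnetic field are enough to build the rotation matrix
    sm.registerListener(this,
            sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
            SensorManager.SENSOR_DELAY_GAME);
    sm.registerListener(this,
            sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
            SensorManager.SENSOR_DELAY_GAME);
    // only needed if you also read the values of the orientation sensor
    sm.registerListener(this,
            sm.getDefaultSensor(Sensor.TYPE_ORIENTATION),
            SensorManager.SENSOR_DELAY_GAME);
}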

After that, in onDrawFrame(), you can use the rotationMatrix calculated in the previous step:

Code:
public void onDrawFrame(GL10 gl) {

Camera cam = world.getCamera();

if (landscape) {
// in landscape mode first remap the
// rotationMatrix before using
// it with camera.setBack:
float[] result = new float[9];
SensorManager.remapCoordinateSystem(
rotationMatrix, SensorManager.AXIS_MINUS_Y,
SensorManager.AXIS_MINUS_X, result);
com.threed.jpct.Matrix mResult = new com.threed.jpct.Matrix();
copyMatrix(result, mResult);
cam.setBack(mResult);
} else {
// WARNING: This solution doesn't work in portrait mode
// See the explanation below
}
// ... Draw here your own 3D world
fb.clear();
world.renderScene(fb);
world.draw(fb);
blitNumber(lfps, 5, 5);
fb.display();
}

private void copyMatrix(float[] src, com.threed.jpct.Matrix dest) {
dest.setRow(0, src[0], src[1], src[2], 0);    
dest.setRow(1, src[3], src[4], src[5], 0);    
dest.setRow(2, src[6], src[7], src[8], 0);    
dest.setRow(3, 0f, 0f, 0f, 1f);
}


As you can read in the code above, this solution doesn't work when the screen is in portrait mode, so you must add the following declaration to the manifest file:

Code:
<activity android:name=".myActivity" ...
          android:screenOrientation="landscape">
    ...
</activity>

Anyone who manages to make the solution work in portrait mode as well is invited to join the discussion  ;)

Please note that the coordinate system will be the one of the rotation matrix returned by the SensorManager.getRotationMatrix method, as described in the Android API documentation:
Quote
  • X is defined as the vector product Y.Z (It is tangential to the ground at the device's current location and roughly points East).
  • Y is tangential to the ground at the device's current location and points towards the magnetic North Pole.
  • Z points towards the sky and is perpendicular to the ground.

(See attached figure)
Take it into account when you move your objects in the scene.

Hope this helps someone  :D

Paolo

P.S.: please forgive the grammatical errors coming from an Italian mother tongue  :-\

[attachment deleted by admin]

3
Bugs / Maybe a bug? Blank screen after Home button pressed
« on: August 23, 2010, 11:53:25 pm »
Hi,
first of all, thank you very much for this amazing library for Android.
I noticed a strange behaviour in my 3D application, and the same problem is present in your demo downloaded from the site, so you'll be able to verify it directly.
To reproduce the problem:

1 - Open the demo application: it works fine
2 - Press the 'Home' button to return to the Android home screen
3 - Open the demo application again: now I see only a blank screen  ???

In my application, this problem also occurs when pressing the 'Back' button.
The behaviour is the same both on my HTC Desire with Froyo 2.2 and on the Android emulator.

Thank you very much for your support
Paolo

P.S.: Sorry for my bad English (I'm Italian  ;D)

