[Tips] How to use the orientation sensors to set the jPCT camera to correspond

Started by nimo, August 30, 2010, 08:07:27 PM


nimo

Hello,
after a lot of trial and error, I was able to use the orientation sensors to set the jPCT camera so that it corresponds to the device's movements.
This technique uses the Camera.setBack method, and it seems to work very well.
The solution comes from this interesting article: http://stackoverflow.com/questions/2881128/how-to-use-onsensorchanged-sensor-data-in-combination-with-opengl

The article is about using the sensors directly with the OpenGL API, so I had to adapt it to make it work with jPCT.

First of all, you must get the sensors' values:


public void onSensorChanged(SensorEvent event) {
    final int type = event.sensor.getType();
    if (type == Sensor.TYPE_ACCELEROMETER) {
        accelGData = event.values.clone();
    }
    if (type == Sensor.TYPE_MAGNETIC_FIELD) {
        magnetData = event.values.clone();
    }
    if (type == Sensor.TYPE_ORIENTATION) {
        orientationData = event.values.clone();
    }
    rootMeanSquareBuffer(bufferedAccelGData, accelGData);
    rootMeanSquareBuffer(bufferedMagnetData, magnetData);
    SensorManager.getRotationMatrix(rotationMatrix, null,
            bufferedAccelGData, bufferedMagnetData);
}


I omitted the code needed to register the SensorManager listeners, because there is nothing new there compared to the standard approach.
Note the use of the rootMeanSquareBuffer() function to smooth the device movement.
It is essential for obtaining a stable camera movement.
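The rootMeanSquareBuffer() helper itself is not shown in this thread (it comes from the linked article), but the idea is a weighted root-mean-square blend of each new sample into the buffered value. Here is a minimal, self-contained sketch; the OFFSET and WEIGHT constants are assumptions chosen for illustration, with the offset keeping values positive so the square root preserves sign:

```java
/** Sketch of an RMS smoothing buffer; not the original helper from the article. */
public final class Smoothing {
    private static final float OFFSET = 200f; // shifts sensor values into the positive range (assumed)
    private static final float WEIGHT = 20f;  // how strongly the buffer resists new samples (assumed)

    /** Blends 'sample' into 'buffer' using a weighted root-mean-square average. */
    public static void rootMeanSquareBuffer(float[] buffer, float[] sample) {
        for (int i = 0; i < buffer.length; i++) {
            float b = buffer[i] + OFFSET;
            float s = sample[i] + OFFSET;
            buffer[i] = (float) Math.sqrt((b * b * WEIGHT + s * s) / (1 + WEIGHT)) - OFFSET;
        }
    }
}
```

With repeated calls the buffered value converges smoothly towards the latest sample, which is what removes the jitter from the camera.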

After that, in onDrawFrame, you can use the rotationMatrix calculated in the previous step:


public void onDrawFrame(GL10 gl) {

    Camera cam = world.getCamera();

    if (landscape) {
        // in landscape mode first remap the rotationMatrix
        // before using it with camera.setBack:
        float[] result = new float[9];
        SensorManager.remapCoordinateSystem(rotationMatrix,
                SensorManager.AXIS_MINUS_Y, SensorManager.AXIS_MINUS_X, result);
        com.threed.jpct.Matrix mResult = new com.threed.jpct.Matrix();
        copyMatrix(result, mResult);
        cam.setBack(mResult);
    } else {
        // WARNING: This solution doesn't work in portrait mode
        // See the explanation below
    }
    // ... Draw your own 3D world here
    fb.clear();
    world.renderScene(fb);
    world.draw(fb);
    blitNumber(lfps, 5, 5);
    fb.display();
}

private void copyMatrix(float[] src, com.threed.jpct.Matrix dest) {
    dest.setRow(0, src[0], src[1], src[2], 0);
    dest.setRow(1, src[3], src[4], src[5], 0);
    dest.setRow(2, src[6], src[7], src[8], 0);
    dest.setRow(3, 0f, 0f, 0f, 1f);
}



As you can see in the code above, this solution doesn't work when the screen is in portrait mode, so you must add the following declaration to the manifest file:


       <activity android:name=".myActivity" ...
                    android:screenOrientation="landscape">
              ...
       </activity>


Anyone who finds a solution that also works in portrait mode is invited to join the discussion  ;)

Please note that the coordinate system will be the one returned by the SensorManager.getRotationMatrix method, as described in the Android API documentation:
Quote

  • X is defined as the vector product Y × Z (it is tangential to the ground at the device's current location and roughly points East).
  • Y is tangential to the ground at the device's current location and points towards the magnetic North Pole.
  • Z points towards the sky and is perpendicular to the ground.
(See attached figure)
Take this into account when you move your objects in the scene.
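Since jPCT's world convention has +Y pointing down and +Z into the screen, while the sensor frame described above has +Z pointing up, a tiny helper can make the mapping explicit when placing objects. The class and method names here are made up, and the exact mapping depends on the remapping you apply to the rotation matrix, so treat this as an illustrative sketch and verify the axes in your own scene:

```java
/** Hypothetical axis mapping from the sensor world frame to jPCT's convention. */
public final class Axes {
    /** Maps a sensor-frame vector (x = East, y = North, z = up)
     *  to jPCT's convention (+x right, +y down, +z into the screen). */
    public static float[] sensorToJpct(float east, float north, float up) {
        return new float[] { east, -up, north };
    }
}
```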

Hope this helps someone  :D

Paolo

P.S.: please forgive any grammatical errors coming from an Italian mother tongue  :-\

[attachment deleted by admin]

EgonOlsen

I'm sure this will be helpful to some people, so I made it sticky. Thanks for sharing.

pritom057

Hi

I am using a T-Mobile device... the speed is very slow.
What could be the reason?
Which phone do you people prefer for developing 3D applications for Android using this API?

lordpaolo

Thank you for this; it needs some polishing on my side, but it still works well :)

32kda

I've tried solving this problem in a somewhat simpler way, using accelerometer values only.
In the activity class, in the onSensorChanged method:


   @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER)
            return;
        /*
         * Record the accelerometer data. We need to take into account how the
         * screen is rotated with respect to the sensors (which always return
         * data in a coordinate space aligned with the screen in its native
         * orientation).
         */

        switch (mDisplay.getRotation()) {
            case Surface.ROTATION_0:
                mSensorX = event.values[0];
                mSensorY = event.values[1];
                break;
            case Surface.ROTATION_90:
                mSensorX = -event.values[1];
                mSensorY = event.values[0];
                break;
            case Surface.ROTATION_180:
                mSensorX = -event.values[0];
                mSensorY = -event.values[1];
                break;
            case Surface.ROTATION_270:
                mSensorX = event.values[1];
                mSensorY = -event.values[0];
                break;
        }
        if (gameManager != null) {
            gameManager.handleSensor(mSensorX, mSensorY);
        }
    }

This method is mostly copied from the Android sample ball game.

gameManager is the instance of my game's class, which handles user input, the game loop, creating game objects, etc. gameManager.handleSensor(mSensorX, mSensorY) can be replaced with your own method for handling accelerometer values.
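The rotation-dependent remapping inside onSensorChanged can be pulled out into a plain helper so it can be unit-tested off-device. The constants below mirror Surface.ROTATION_0 through ROTATION_270, which are defined as 0..3 in the Android API; the class and method names are made up for this sketch:

```java
/** Off-device sketch of the display-rotation remapping shown above. */
public final class AccelRemap {
    // Mirror android.view.Surface.ROTATION_* (0..3) so this compiles without Android.
    public static final int ROTATION_0 = 0, ROTATION_90 = 1,
                            ROTATION_180 = 2, ROTATION_270 = 3;

    /** Returns { sensorX, sensorY } for the given display rotation. */
    public static float[] remap(int rotation, float x, float y) {
        switch (rotation) {
            case ROTATION_90:  return new float[] { -y,  x };
            case ROTATION_180: return new float[] { -x, -y };
            case ROTATION_270: return new float[] {  y, -x };
            default:           return new float[] {  x,  y }; // ROTATION_0
        }
    }
}
```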

Then comes the GameManager class:



final int TICKS_PER_SECOND = 50;
final int SKIP_TICKS = 1000 / TICKS_PER_SECOND;

/**
 * Game loop thread
 * @author 32kda
 */
protected class GameLoop extends Thread {

    public GameLoop() {
        super("GameLoop");
    }

    @Override
    public void run() {
        long ticks = System.currentTimeMillis();
        while (!isPaused()) {
            long current = System.currentTimeMillis();
            if (current - ticks < SKIP_TICKS) {
                try {
                    sleep(SKIP_TICKS - (current - ticks));
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
            updateGame(System.currentTimeMillis() - ticks); // call the update method with the elapsed time
            ticks = current;
        }
        super.run();
    }

    public boolean isPaused() {
        return false;
    }
}

...

final SimpleVector initialVectorUp = new SimpleVector(0f, -1f, 0f); // Initial vector pointing up
SimpleVector vectorUp = new SimpleVector(0f, -1f, 0f); // Camera's up vector. Re-initialized and rotated whenever the camera orientation changes, to set the camera's up/down angle

private float sensorX; // Accelerometer X & Y values
private float sensorY;
private float initialY = Float.MIN_VALUE; // Field for storing the initial y accelerometer value
private SimpleVector cameraVector = new SimpleVector(0f, 0f, 1f); // Camera horizontal orientation/azimuth vector
private SimpleVector orthoVector = new SimpleVector(); // Vector for rotating the camera up/down; should be orthogonal to cameraVector
private Matrix cameraMatrix = new Matrix(); // Camera up-vector rotation matrix
private float angle = 0; // Camera up/down rotation angle

/**
 * handleSensor method. Simply remembers the sensorX / sensorY values for future use.
 */
public void handleSensor(float mSensorX, float mSensorY) {
    this.sensorX = mSensorX;
    this.sensorY = mSensorY;
}

/**
 * Called from the game loop; used to update our world, camera, etc.
 */
protected void updateGame(long delta) {

    if (Math.abs(sensorX) > 0.3) { // If the x accelerometer value exceeds some threshold, rotate the camera direction vector accordingly
        cameraVector.rotateY(-sensorX / 150);
    }
    Camera camera = world.getCamera();
    // Because it's not convenient to play with the device held strictly horizontal, I use some
    // "initial" value for the vertical angle and treat it as the position corresponding to a
    // zero vertical camera angle
    if (initialY == Float.MIN_VALUE || initialY == 0.0)
        initialY = sensorY;
    float yDiff = sensorY - initialY; // Calculate the difference
    if (Math.abs(yDiff) > 0.3) {
        angle -= yDiff / 200;
        if (angle > 0.40f) // Limit the vertical angle
            angle = 0.40f;
        if (angle < -0.40f)
            angle = -0.40f;
    }
    vectorUp.set(initialVectorUp); // Re-init the camera up vector before rotating it to the necessary angle
    orthoVector.x = cameraVector.z; // Calculate a vector orthogonal to the camera vector (the up/down rotation axis)
    orthoVector.z = -cameraVector.x;
    calcRotMatrix(cameraMatrix, orthoVector, angle); // Calculate the rotation matrix
    vectorUp.rotate(cameraMatrix); // Rotate the camera up vector
    camera.setOrientation(cameraVector, vectorUp); // Set the camera orientation
}

/**
 * Calculates the rotation matrix for an arbitrary axis.
 * This is the standard axis-angle (Rodrigues) rotation matrix.
 * @param matrix Matrix to fill
 * @param rotVector Arbitrary rotation axis vector (should be normalized)
 * @param angle Rotation angle
 */
protected void calcRotMatrix(Matrix matrix, SimpleVector rotVector, float angle) {
    matrix.setColumn(3, 0, 0, 0, 1);
    matrix.setRow(3, 0, 0, 0, 1);
    double sin = Math.sin(angle);
    double cos = Math.cos(angle);
    double subcos = 1 - cos;
    float x = rotVector.x;
    float y = rotVector.y;
    float z = rotVector.z;

    matrix.set(0, 0, (float) (cos + subcos * x * x));
    matrix.set(0, 1, (float) (subcos * x * y - sin * z));
    matrix.set(0, 2, (float) (subcos * x * z + sin * y));

    matrix.set(1, 0, (float) (subcos * y * x + sin * z));
    matrix.set(1, 1, (float) (cos + subcos * y * y));
    matrix.set(1, 2, (float) (subcos * y * z - sin * x));

    matrix.set(2, 0, (float) (subcos * z * x - sin * y));
    matrix.set(2, 1, (float) (subcos * z * y + sin * x));
    matrix.set(2, 2, (float) (cos + subcos * z * z));
}
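The matrix built in calcRotMatrix is the standard axis-angle (Rodrigues) rotation matrix, and a standalone version with plain 3x3 arrays can be checked off-device. Note that jPCT multiplies row vectors, so the in-engine result may correspond to the transpose of the column-vector convention used in this sketch:

```java
/** Standalone axis-angle rotation, for checking the matrix above off-device. */
public final class AxisAngle {
    /** Builds the Rodrigues rotation matrix for a unit axis (x, y, z) and an angle in radians. */
    public static float[][] rotationMatrix(float x, float y, float z, float angle) {
        double sin = Math.sin(angle), cos = Math.cos(angle), sub = 1 - cos;
        return new float[][] {
            { (float) (cos + sub * x * x), (float) (sub * x * y - sin * z), (float) (sub * x * z + sin * y) },
            { (float) (sub * y * x + sin * z), (float) (cos + sub * y * y), (float) (sub * y * z - sin * x) },
            { (float) (sub * z * x - sin * y), (float) (sub * z * y + sin * x), (float) (cos + sub * z * z) }
        };
    }

    /** Applies m to the column vector v. */
    public static float[] apply(float[][] m, float[] v) {
        float[] r = new float[3];
        for (int i = 0; i < 3; i++) {
            r[i] = m[i][0] * v[0] + m[i][1] * v[1] + m[i][2] * v[2];
        }
        return r;
    }
}
```

Rotating the x axis by 90° about the z axis yields the y axis, as expected for a right-handed rotation.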

MrYogi

hi nimo,
I followed your steps for passing the sensor values, but the final result is just a black screen. Is it due to other camera parameters?
Can you share your rendering code? I am new to jPCT, and it would be a great help.

behelit

I would highly recommend using Sensor.TYPE_ROTATION_VECTOR instead of the others. It automatically combines the sensors and applies low-pass filtering, producing a smoother result similar to the Google Camera's photo sphere, although this may depend on the SDK version used (API 9+).
It also incorporates gyroscope data and handles devices without a gyroscope by falling back to the accelerometer only.

You can also support portrait mode by applying the appropriate remapping; I don't remember the exact mapping, but I can test it if people actually need it.

Fixta

Hi Guys!

First things first, thank you for JPCT-Blend, which I find very handy, and for this forum.
As part of a master's thesis, I'm currently working on a VR prototype.
For now, everything works fine and the way I want it to, except that the sample scene appears to have been
rotated -90° on the Y axis of the Blender coordinate system.
I tried to remap the sensor coordinate system, to no avail.

I also tried the solution given by Paolo in this forum, without success.
One solution I see would be to multiply the jPCT view matrix by the proper rotation matrix (i.e., +90° on the Z axis, if I'm correct).
My question is: how could I do that in a clean way?

Thanks for your help!
François-Xavier.

PS: I'm using this excellent work by Alexander Pacha to get sensor fusion on the Android device. It works incredibly well for VR:
https://bitbucket.org/apacha/sensor-fusion-demo

EgonOlsen

I'm not sure what exactly you mean. You actually want to rotate the camera 90° around its (maybe transformed) z-axis? If so, you could do something like


camera.rotateAxis(camera.getZAxis(), ...);


Or isn't that what you meant?

Fixta

Thank you so much for your quick answer!
I can't believe I spent hours on this one :)
I was doing
camera.rotateAxis(new SimpleVector(0.0f, 0.0f, 1.0f), (float) (Math.PI / 2)) instead of
camera.rotateAxis(camera.getZAxis(), (float) (Math.PI / 2))!

This is why the rotations were no longer correct after rotating the axis.

Everything is fine now, thanks a lot!
Bravo and thank you for JPCT-Blend :)