jPCT and real 3D

Started by guillaume, September 16, 2012, 07:02:25 AM


guillaume

Dear Egon,
some Android-based smart TV sets can output two different
3D images (one for the left eye and one for the right eye) to support real 3D.
I wonder how to do this with jPCT-AE.
Does jPCT support two cameras in one World,
so that with these two cameras (one for the left eye, one for the right eye) I can
render the scene separately for each eye?
Can I instantiate two FrameBuffers in the same GLSurfaceView (each one occupying half of the surface view)
and render the World (with a different camera) into each of them?

EgonOlsen

I was under the impression that some magic in the 3d drivers does this without any need to change the application somehow... ???

kiffa

I'm also interested in this question!

And it may depend on how the hardware implements real 3D.

guillaume

Quote from: EgonOlsen on September 16, 2012, 08:48:58 AM
I was under the impression that some magic in the 3d drivers does this without any need to change the application somehow... ???
Yes, that's how it is. But is it possible for us to do it at the application level?

EgonOlsen

You can render the scene multiple times with different cameras...if that does any good.
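
Something along these lines (a minimal sketch, not from the examples; the helper and its parameter are made up for illustration):

// One pass for one eye: shift the world's camera sideways, render into
// whatever buffer you use for that eye, then shift the camera back.
void renderEye(World world, FrameBuffer target, float sidewaysOffset) {
    Camera cam = world.getCamera();
    cam.moveCamera(Camera.CAMERA_MOVERIGHT, sidewaysOffset);
    target.clear();
    world.renderScene(target);
    world.draw(target);
    target.display();
    cam.moveCamera(Camera.CAMERA_MOVERIGHT, -sidewaysOffset);
}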

guillaume

Quote from: EgonOlsen on September 21, 2012, 05:05:12 PM
You can render the scene multiple times with different cameras...if that does any good.
Actually, after setting up the world, I want to clone the world's camera
into two new cameras, but there is no clone method for Camera.
How should I do it? Thanks.

EgonOlsen

Like so:


Camera cam2=new Camera();
cam2.setFOV(cam.getFOV());
cam2.setPosition(cam.getPosition());
cam2.setBack(cam.getBack().cloneMatrix());
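
One possible way to use this for the two eyes (the eye-separation value is an assumption, tune it per scene): clone the scene camera twice, shift each copy sideways, and copy the chosen eye's state back into the world's camera before rendering its pass. Here cam is the world's camera, as in the snippet above.

float eyeSeparation = 1.0f; // assumed value

Camera leftEye = new Camera();
leftEye.setFOV(cam.getFOV());
leftEye.setPosition(cam.getPosition());
leftEye.setBack(cam.getBack().cloneMatrix());
leftEye.moveCamera(Camera.CAMERA_MOVELEFT, eyeSeparation / 2f);

Camera rightEye = new Camera();
rightEye.setFOV(cam.getFOV());
rightEye.setPosition(cam.getPosition());
rightEye.setBack(cam.getBack().cloneMatrix());
rightEye.moveCamera(Camera.CAMERA_MOVERIGHT, eyeSeparation / 2f);

// before rendering one eye's pass, e.g. the left one:
cam.setPosition(leftEye.getPosition());
cam.setBack(leftEye.getBack().cloneMatrix());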

guillaume

Finally, I got my real 3D rendering working.
The basic idea is:
1. create one offscreen FrameBuffer for each eye
2. set the FrameBuffer to render to a Texture
3. render one eye's view to the texture
4. blit the texture to the screen, downscaled to half of its original width.

Then the problems come:
1. After several frames, an OOM exception is thrown,
with many logs like:
I/jPCT-AE (  574): OpenGL context has changed...trying to recover!
I/jPCT-AE (  574): OpenGL context has changed...trying to recover!
I/jPCT-AE (  574): OpenGL context has changed...trying to recover!
I/jPCT-AE (  574): Additional visibility list (380) created with size: 512
I/jPCT-AE (  574): OpenGL context has changed...trying to recover!
I/jPCT-AE (  574): OpenGL context has changed...trying to recover!
I/jPCT-AE (  574): OpenGL context has changed...trying to recover!
I/jPCT-AE (  574): Additional visibility list (390) created with size: 512

2. The blitted, downscaled image is upside down.

Dear Egon, can you give some advice? Thanks.


import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import com.threed.jpct.FrameBuffer;
import com.threed.jpct.Logger;
import com.threed.jpct.RGBColor;
import com.threed.jpct.Texture;
import com.tpv.ui.trid.TriDScene;

import android.opengl.GLSurfaceView.Renderer;
import android.util.Log;

// One Real3DRenderer instance is used per eye (one GLSurfaceView each).
public class Real3DRenderer implements Renderer {

    public enum Position {
        LEFT,
        RIGHT
    }

    private TriDScene mScene = null;
    private FrameBuffer mFb = null;        // on-screen buffer
    private FrameBuffer mBackFb = null;    // offscreen buffer rendering into mBackTexture
    private Texture mBackTexture = null;
    private Position mPos = Position.LEFT;
    private float mSpeed = 0.0f;           // camera offset for this eye
    private final float SPEED_OFFSET = 0.01f;
    private final RGBColor black = new RGBColor(0, 0, 0, 0);

    public Real3DRenderer(TriDScene scene, Position pos) {
        mScene = scene;
        mPos = pos;
    }

    public void incAngle() {
        mSpeed += SPEED_OFFSET;
        Log.d("3DUI", (mPos == Position.LEFT ? "left" : "right") + " camera speed " + mSpeed);
    }

    public void decAngle() {
        mSpeed -= SPEED_OFFSET;
        if (mSpeed < 0) {
            mSpeed = 0.0f;
        }
        Log.d("3DUI", (mPos == Position.LEFT ? "left" : "right") + " camera speed " + mSpeed);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // render this eye's view into the offscreen buffer / texture
        mBackFb.clear(black);
        if (mPos == Position.LEFT) {
            mScene.renderFrame(mBackFb, -mSpeed);
        } else {
            mScene.renderFrame(mBackFb, mSpeed);
        }
        mBackFb.display();

        // blit the texture to the screen, squeezed to half of its width
        mFb.clear(black);
        mFb.blit(mBackTexture, 0, 0, 0, 0,
                mBackTexture.getWidth(), mBackTexture.getHeight(),
                mBackTexture.getWidth() / 2, mBackTexture.getHeight(),
                -1, false, null);
        mFb.display();
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        if (mFb != null) {
            mFb.dispose();
        }
        mFb = new FrameBuffer(gl, width, height);
        Logger.log("==== Launcher launch: " + width + "x" + height
                + " OpenGL Major Version: " + mFb.getOpenGLMajorVersion());

        if (mBackFb != null) {
            mBackFb.dispose();
        }
        mBackFb = new FrameBuffer(gl, width * 2, height);
        mBackTexture = new Texture(width * 2, height);
        mBackFb.setRenderTarget(mBackTexture);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    }
}

EgonOlsen

You are not supposed to use multiple FrameBuffer instances at a time. This makes jPCT-AE think that the context has changed each time you render into one, which is why it behaves the way it does. You should be able to rewrite the code to use one buffer instead. If you assign a render target, the actual buffer size will be overridden by the size of the texture.
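
A hedged sketch of the single-buffer pattern (the texture name and the half-width blit are assumptions, not code from this thread): the same FrameBuffer first renders into a target texture, then back onto the screen.

// While the render target is set, the buffer renders at the texture's size.
fb.setRenderTarget(eyeTexture);
fb.clear();
world.renderScene(fb);
world.draw(fb);
fb.display();
fb.removeRenderTarget(); // back to normal, screen-sized rendering

// Blit the result onto (half of) the screen with the same buffer
// (render targets come out upside down; flipping is covered in the next post).
fb.clear();
fb.blit(eyeTexture, 0, 0, 0, 0,
        eyeTexture.getWidth(), eyeTexture.getHeight(),
        fb.getWidth() / 2, fb.getHeight(), -1, false, null);
fb.display();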

EgonOlsen

About things being upside down: that's caused by the way GL's coordinate system works. You can try to blit with a negative height...you should be able to find something about this somewhere around here.

Edit: Like so:


fb.blit(renderTarget, 0, 0, 0, fb.getHeight(), fb.getWidth(), fb.getHeight(), fb.getWidth(), -fb.getHeight(), -1, false, null);
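
(The destination y of fb.getHeight() combined with the negative destination height makes the blit draw from the bottom edge upwards, which flips the image back the right way up.)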

guillaume

Quote from: EgonOlsen on October 09, 2012, 06:27:39 PM
You are not supposed to use multiple FrameBuffer instances at a time. This makes jPCT-AE think that the context has changed each time you render into one, which is why it behaves the way it does. You should be able to rewrite the code to use one buffer instead. If you assign a render target, the actual buffer size will be overridden by the size of the texture.

Currently I render the left eye's and the right eye's scenes to the screen side by side. To achieve this,
I use two GLSurfaceViews on the screen with two FrameBuffers.
Using ONE FrameBuffer, how can I render the scene side by side on the screen?
And before rendering to the screen, I would like to downscale the rendered image to half of its width.

EgonOlsen

Does real 3D require using two GLSurfaceView instances? Or would it work with one that contains both parts of the image?

guillaume

Quote from: EgonOlsen on October 10, 2012, 08:16:54 AM
Does real 3D require using two GLSurfaceView instances? Or would it work with one that contains both parts of the image?
No, it doesn't.
We just require that the image contains the left and right eye's views side by side.

EgonOlsen

Well then...create one framebuffer of the output size, create one (or two) texture(s) to render the parts into, set them as render targets one after the other, render the parts in each one, remove the render target and blit both textures into the framebuffer. You might have to enable OpenGL ES 2.0 for proper render to texture btw. (if you haven't already).
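
Something along these lines, maybe (a rough, untested sketch of that recipe: the World wiring, the class name, the eye offset and the texture sizes are assumptions, and it presumes the OpenGL ES 2.0 FrameBuffer constructor):

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import com.threed.jpct.Camera;
import com.threed.jpct.FrameBuffer;
import com.threed.jpct.Texture;
import com.threed.jpct.World;

import android.opengl.GLSurfaceView.Renderer;

public class SideBySideRenderer implements Renderer {

    private World world;                  // assumed to be built elsewhere
    private FrameBuffer fb;               // the single, screen-sized buffer
    private Texture leftEye, rightEye;    // render targets for the two halves
    private float eyeOffset = 0.5f;       // assumed half eye separation (world units)

    public SideBySideRenderer(World world) {
        this.world = world;
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int w, int h) {
        if (fb != null) {
            fb.dispose();
        }
        fb = new FrameBuffer(w, h);       // OpenGL ES 2.0 constructor (no GL10 argument)
        leftEye = new Texture(w, h);      // power-of-two sizes may be safer on some GPUs
        rightEye = new Texture(w, h);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        renderEye(leftEye, -eyeOffset);
        renderEye(rightEye, eyeOffset);

        // Blit both parts side by side, each squeezed to half the screen width.
        // destY = fb.getHeight() plus a negative destination height flips them,
        // as in the upside-down discussion above.
        fb.clear();
        fb.blit(leftEye, 0, 0, 0, fb.getHeight(),
                leftEye.getWidth(), leftEye.getHeight(),
                fb.getWidth() / 2, -fb.getHeight(), -1, false, null);
        fb.blit(rightEye, 0, 0, fb.getWidth() / 2, fb.getHeight(),
                rightEye.getWidth(), rightEye.getHeight(),
                fb.getWidth() / 2, -fb.getHeight(), -1, false, null);
        fb.display();
    }

    // One pass per eye: render the world into the target texture with the
    // camera shifted sideways, then restore the camera and the render target.
    private void renderEye(Texture target, float sidewaysOffset) {
        fb.setRenderTarget(target);       // buffer size now follows the texture
        fb.clear();
        world.getCamera().moveCamera(Camera.CAMERA_MOVERIGHT, sidewaysOffset);
        world.renderScene(fb);
        world.draw(fb);
        fb.display();
        world.getCamera().moveCamera(Camera.CAMERA_MOVERIGHT, -sidewaysOffset);
        fb.removeRenderTarget();
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    }
}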

guillaume

Thanks, Egon.
With FrameBuffer.blit's destWidth argument set to mFb.getWidth()/2, I got device-2012-10-11-111842-half.png;
with destWidth set to mFb.getWidth(), I got device-2012-10-11-111934-full.png.
My goal is to show the full content in the left half of the screen.

The half-width image does not show the full content of the full-width image; some of it is lost.
Does this mean that FrameBuffer.blit is broken on Android?


mFb.blit(mLeftTexture,
        0, 0, 0, mFb.getHeight(),
        mLeftTexture.getWidth(), mLeftTexture.getHeight(),
        mFb.getWidth(), -mFb.getHeight(), -1, false, null);


[attachment deleted by admin]