Load 3d model on top of Android camera preview?

Started by jaychang0917, February 10, 2017, 04:57:45 PM


jaychang0917

Hi everyone, I have a project that needs to meet the following requirements:

  • Show android camera preview
  • Load a 3d model on top of it
  • Record the current display and save to a video file

I tried to implement requirements 1 and 2 with the following code, but it throws an ArrayIndexOutOfBoundsException.


java.lang.ArrayIndexOutOfBoundsException: length=0; index=0
   at com.threed.jpct.CompiledInstance._fill(CompiledInstance.java:1206)
   at com.threed.jpct.CompiledInstance.fill(CompiledInstance.java:746)
   at com.threed.jpct.Object3DCompiler.compile(Object3DCompiler.java:148)
   at com.threed.jpct.World.compile(World.java:1951)
   at com.threed.jpct.World.renderScene(World.java:1046)
   at test.com.cameratest.MainRenderer.onDrawFrame(MainActivity.java:196)
   at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1522)
   at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1239)


Here is my code:

public class MainActivity extends Activity {
  private MainView mView;
  private PowerManager.WakeLock mWL;

  @Override
  public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // full screen & full brightness
    requestWindowFeature(Window.FEATURE_NO_TITLE);
    getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
    getWindow().setFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON, WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
    mWL = ((PowerManager) getSystemService(Context.POWER_SERVICE)).newWakeLock(PowerManager.FULL_WAKE_LOCK, "WakeLock");
    mWL.acquire();
    mView = new MainView(this);
    setContentView(mView);
  }

  @Override
  protected void onPause() {
    if (mWL.isHeld())
      mWL.release();
    mView.onPause();
    super.onPause();
  }

  @Override
  protected void onResume() {
    super.onResume();
    mView.onResume();
    if (!mWL.isHeld()) mWL.acquire();
  }
}


// View
class MainView extends GLSurfaceView {
  MainRenderer mRenderer;

  MainView(Context context) {
    super(context);
    mRenderer = new MainRenderer(this);
    setEGLContextClientVersion(2);
    setRenderer(mRenderer);
    setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
  }

  public void surfaceCreated(SurfaceHolder holder) {
    super.surfaceCreated(holder);
  }

  public void surfaceDestroyed(SurfaceHolder holder) {
    mRenderer.close();
    super.surfaceDestroyed(holder);
  }

  public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    super.surfaceChanged(holder, format, w, h);
  }

}


// Renderer
class MainRenderer implements GLSurfaceView.Renderer, SurfaceTexture.OnFrameAvailableListener {
  private final String vss =
    "attribute vec2 vPosition;\n" +
      "attribute vec2 vTexCoord;\n" +
      "varying vec2 texCoord;\n" +
      "void main() {\n" +
      "  texCoord = vTexCoord;\n" +
      "  gl_Position = vec4 ( vPosition.x, vPosition.y, 0.0, 1.0 );\n" +
      "}";

  private final String fss =
    "#extension GL_OES_EGL_image_external : require\n" +
      "precision mediump float;\n" +
      "uniform samplerExternalOES sTexture;\n" +
      "varying vec2 texCoord;\n" +
      "void main() {\n" +
      "  gl_FragColor = texture2D(sTexture,texCoord);\n" +
      "}";

  private int[] hTex;
  private FloatBuffer pVertex;
  private FloatBuffer pTexCoord;
  private int hProgram;

  private Camera mCamera;
  private SurfaceTexture mSTexture;

  private boolean mUpdateST = false;

  private MainView mView;

  private FrameBuffer fb = null;
  private World world = null;
  private Object3D model = null;
  private Light sun = null;

  MainRenderer(MainView view) {
    mView = view;
    float[] vtmp = {1.0f, -1.0f, -1.0f, -1.0f, 1.0f, 1.0f, -1.0f, 1.0f};
    float[] ttmp = {1.0f, 1.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 0.0f};
    pVertex = ByteBuffer.allocateDirect(8 * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
    pVertex.put(vtmp);
    pVertex.position(0);
    pTexCoord = ByteBuffer.allocateDirect(8 * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
    pTexCoord.put(ttmp);
    pTexCoord.position(0);
  }

  public void close() {
    mUpdateST = false;
    mSTexture.release();
    mCamera.stopPreview();
    mCamera.release();
    mCamera = null;
    deleteTex();
  }

  public void onSurfaceCreated(GL10 unused, EGLConfig config) {
    System.out.println("onSurfaceCreated");

    //String extensions = GLES20.glGetString(GLES20.GL_EXTENSIONS);
    //Log.i("mr", "Gl extensions: " + extensions);
    //Assert.assertTrue(extensions.contains("OES_EGL_image_external"));

    initTex();
    mSTexture = new SurfaceTexture(hTex[0]);
    mSTexture.setOnFrameAvailableListener(this);

    mCamera = Camera.open();
    try {
      mCamera.setPreviewTexture(mSTexture);
    } catch (IOException ioe) {
    }

    GLES20.glClearColor(1.0f, 1.0f, 0.0f, 1.0f);

    hProgram = loadShader(vss, fss);

    initModel();
  }

  public void onDrawFrame(GL10 unused) {
    System.out.println("onDrawFrame");

    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

    fb.clear();
    world.renderScene(fb);
    world.draw(fb);
    fb.display();

    synchronized (this) {
      if (mUpdateST) {
        mSTexture.updateTexImage();
        mUpdateST = false;
      }
    }

    GLES20.glUseProgram(hProgram);

    int ph = GLES20.glGetAttribLocation(hProgram, "vPosition");
    int tch = GLES20.glGetAttribLocation(hProgram, "vTexCoord");
    int th = GLES20.glGetUniformLocation(hProgram, "sTexture");

    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, hTex[0]);
    GLES20.glUniform1i(th, 0);

    GLES20.glVertexAttribPointer(ph, 2, GLES20.GL_FLOAT, false, 4 * 2, pVertex);
    GLES20.glVertexAttribPointer(tch, 2, GLES20.GL_FLOAT, false, 4 * 2, pTexCoord);
    GLES20.glEnableVertexAttribArray(ph);
    GLES20.glEnableVertexAttribArray(tch);

    GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
//    GLES20.glFlush();
  }

  private void initModel() {
    world = new World();
    world.setAmbientLight(20, 20, 20);

    sun = new Light(world);
    sun.setIntensity(250, 250, 250);

    Texture texture = new Texture(BitmapHelper.rescale(BitmapHelper.convert(App.context.getResources().getDrawable(R.drawable.monster)), 512, 512));
    TextureManager.getInstance().addTexture("texture", texture);

    model = loadModel(R.raw.monster, 1);
    model.setTexture("texture");
    model.build();

    world.addObject(model);

    com.threed.jpct.Camera cam = world.getCamera();
    cam.moveCamera(com.threed.jpct.Camera.CAMERA_MOVEOUT, 50);
    cam.lookAt(model.getTransformedCenter());

    SimpleVector sv = new SimpleVector();
    sv.set(model.getTransformedCenter());
    sv.y -= 100;
    sv.z -= 100;
    sun.setPosition(sv);
    MemoryHelper.compact();
  }

  private Object3D loadModel(int filename, float scale) {
    InputStream stream = App.context.getResources().openRawResource(filename);
    Object3D[] model = Loader.load3DS(stream, scale);
    Object3D o3d = new Object3D(0);
    Object3D temp = null;
    for (int i = 0; i < model.length; i++) {
      temp = model[i];
      System.out.println("model:" + temp.getName());
      temp.setCenter(SimpleVector.ORIGIN);
      temp.rotateX((float) (-.5 * Math.PI));
      temp.rotateMesh();
      temp.setRotationMatrix(new Matrix());
      o3d = Object3D.mergeObjects(o3d, temp);
      o3d.build();
    }
    return o3d;
  }


  public void onSurfaceChanged(GL10 unused, int width, int height) {
    System.out.println("onSurfaceChanged");

    if (fb != null) {
      fb.dispose();
    }
    fb = new FrameBuffer(unused, width, height);

    GLES20.glViewport(0, 0, width, height);
    Camera.Parameters param = mCamera.getParameters();
    List<Camera.Size> psize = param.getSupportedPreviewSizes();
    if (psize.size() > 0) {
      int i;
      for (i = 0; i < psize.size(); i++) {
        if (psize.get(i).width < width || psize.get(i).height < height)
          break;
      }
      if (i > 0)
        i--;
      param.setPreviewSize(psize.get(i).width, psize.get(i).height);
      //Log.i("mr","ssize: "+psize.get(i).width+", "+psize.get(i).height);
    }
    param.set("orientation", "landscape");
    mCamera.setParameters(param);
    mCamera.startPreview();
  }

  private void initTex() {
    hTex = new int[1];
    GLES20.glGenTextures(1, hTex, 0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, hTex[0]);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
  }

  private void deleteTex() {
    GLES20.glDeleteTextures(1, hTex, 0);
  }

  public synchronized void onFrameAvailable(SurfaceTexture st) {
    mUpdateST = true;
    mView.requestRender();
  }

  private static int loadShader(String vss, String fss) {
    int vshader = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
    GLES20.glShaderSource(vshader, vss);
    GLES20.glCompileShader(vshader);
    int[] compiled = new int[1];
    GLES20.glGetShaderiv(vshader, GLES20.GL_COMPILE_STATUS, compiled, 0);
    if (compiled[0] == 0) {
      Log.e("Shader", "Could not compile vshader");
      Log.v("Shader", "Could not compile vshader:" + GLES20.glGetShaderInfoLog(vshader));
      GLES20.glDeleteShader(vshader);
      vshader = 0;
    }

    int fshader = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
    GLES20.glShaderSource(fshader, fss);
    GLES20.glCompileShader(fshader);
    GLES20.glGetShaderiv(fshader, GLES20.GL_COMPILE_STATUS, compiled, 0);
    if (compiled[0] == 0) {
      Log.e("Shader", "Could not compile fshader");
      Log.v("Shader", "Could not compile fshader:" + GLES20.glGetShaderInfoLog(fshader));
      GLES20.glDeleteShader(fshader);
      fshader = 0;
    }

    int program = GLES20.glCreateProgram();
    GLES20.glAttachShader(program, vshader);
    GLES20.glAttachShader(program, fshader);
    GLES20.glLinkProgram(program);

    return program;
  }
}



Sample project: https://www.dropbox.com/s/d85q88bx27m71oj/CameraTest.zip?dl=0

Is my approach feasible? Please give me some advice, thank you!

EgonOlsen

Quote: it throws an ArrayIndexOutOfBoundsException

That's not very helpful at all. Some stack trace might help, but to be honest, your code is a wild mix of jPCT-AE and direct GL calls. Even without the exception, it's very unlikely that this will work, because you are pulling the rug from under the engine that way and you'll most likely end up with an undefined state and all kinds of errors and rendering bugs. If you want to mix jPCT-AE and some external image, there's a method for this: http://www.jpct.net/jpct-ae/doc/com/threed/jpct/Texture.html#setExternalId(int, int)

Here's a thread that deals with this; maybe it helps: http://www.jpct.net/forum2/index.php/topic,3794.0.html
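
In rough outline, the approach from that thread is: create a small placeholder jPCT-AE Texture, point it at the OES texture that backs the camera's SurfaceTexture via setExternalId(), and let the engine do the drawing. Below is a minimal sketch of the wiring only, not a full renderer; oesTextureId, surfaceTexture, mCamera and frameAvailable are assumed fields, with the OES texture created and configured as in initTex() above.

// Sketch: bind the camera preview to a jPCT-AE managed texture.
private void bindCameraTexture(int oesTextureId) throws IOException {
  Texture cameraTexture = new Texture(32, 32); // placeholder size; content comes from the external id
  cameraTexture.setExternalId(oesTextureId, GLES11Ext.GL_TEXTURE_EXTERNAL_OES);
  TextureManager.getInstance().addTexture("camera", cameraTexture);

  surfaceTexture = new SurfaceTexture(oesTextureId);
  surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture st) {
      frameAvailable = true; // guard with the same lock used in onDrawFrame()
    }
  });

  mCamera = android.hardware.Camera.open();
  mCamera.setPreviewTexture(surfaceTexture);
  mCamera.startPreview();
}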

jaychang0917

Thanks for your help. I went through the thread you provided; it is really helpful. But I encountered the following error; it seems it can't compile the vertex shader.

java.lang.RuntimeException: failed creating program
    at com.blogspot.geekonjava.dsmodel_ar.TextureRenderer.surfaceCreated(TextureRenderer.java:117)
    at com.blogspot.geekonjava.dsmodel_ar.MainActivity$MyRenderer.onSurfaceChanged(MainActivity.java:203)
    at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1511)
    at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1239)


Here is my code:

TextureRenderer
https://android.googlesource.com/platform/cts/+/master/tests/tests/media/src/android/media/cts/TextureRender.java



package com.blogspot.geekonjava.dsmodel_ar;

import android.app.Activity;
import android.graphics.PixelFormat;
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.GLU;
import android.os.Bundle;
import android.view.MotionEvent;

import com.example.dsmodel_ar.R;
import com.threed.jpct.Camera;
import com.threed.jpct.FrameBuffer;
import com.threed.jpct.Light;
import com.threed.jpct.Loader;
import com.threed.jpct.Logger;
import com.threed.jpct.Matrix;
import com.threed.jpct.Object3D;
import com.threed.jpct.SimpleVector;
import com.threed.jpct.Texture;
import com.threed.jpct.TextureManager;
import com.threed.jpct.World;
import com.threed.jpct.util.BitmapHelper;
import com.threed.jpct.util.MemoryHelper;

import java.io.IOException;
import java.io.InputStream;
import java.lang.reflect.Field;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class MainActivity extends Activity {

  // Used to handle pause and resume...
  private static MainActivity master = null;

  private GLSurfaceView mGLView;
  private MyRenderer renderer = null;
  private FrameBuffer fb = null;
  private World world = null;

  private float touchTurn = 0;
  private float touchTurnUp = 0;

  private float xpos = -1;
  private float ypos = -1;

  private Object3D model = null;
  private Light sun = null;

  private android.hardware.Camera mCamera;

  private SurfaceTexture surfaceTexture;
  private boolean frameAvailable = false;
  private Texture externalTexture;
  private TextureRenderer textureRenderer = new TextureRenderer();

  protected void onCreate(Bundle savedInstanceState) {
    Logger.log("onCreate");

    if (master != null) {
      copy(master);
    }

    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    mGLView = (GLSurfaceView) findViewById(R.id.surfaceView);
    mGLView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
    mGLView.getHolder().setFormat(PixelFormat.TRANSLUCENT);

    renderer = new MyRenderer();
    mGLView.setRenderer(renderer);

    this.externalTexture = new Texture(32, 32);
    TextureManager.getInstance().flush();
    TextureManager.getInstance().addTexture("video_texture", externalTexture);
  }

  @Override
  protected void onPause() {
    super.onPause();
    mGLView.onPause();
  }

  @Override
  protected void onResume() {
    super.onResume();
    mGLView.onResume();
  }

  @Override
  protected void onStop() {
    super.onStop();
  }

  private void copy(Object src) {
    try {
      Logger.log("Copying data from master Activity!");
      Field[] fs = src.getClass().getDeclaredFields();
      for (Field f : fs) {
        f.setAccessible(true);
        f.set(this, f.get(src));
      }
    } catch (Exception e) {
      throw new RuntimeException(e);
    }
  }

  public boolean onTouchEvent(MotionEvent me) {

    if (me.getAction() == MotionEvent.ACTION_DOWN) {
      xpos = me.getX();
      ypos = me.getY();
      return true;
    }

    if (me.getAction() == MotionEvent.ACTION_UP) {
      xpos = -1;
      ypos = -1;
      touchTurn = 0;
      touchTurnUp = 0;
      return true;
    }

    if (me.getAction() == MotionEvent.ACTION_MOVE) {
      float xd = me.getX() - xpos;
      float yd = me.getY() - ypos;

      xpos = me.getX();
      ypos = me.getY();

      touchTurn = xd / -100f;
      touchTurnUp = yd / -100f;
      return true;
    }

    try {
      Thread.sleep(15);
    } catch (Exception e) {
      // No need for this...
    }

    return super.onTouchEvent(me);
  }

  class MyRenderer implements GLSurfaceView.Renderer {

    private long time = System.currentTimeMillis();

    public MyRenderer() {
    }

    public void onSurfaceChanged(GL10 gl, int w, int h) {
      if (fb != null) {
        fb.dispose();
      }
      fb = new FrameBuffer(gl, w, h);

      if (master == null) {

        world = new World();
        world.setAmbientLight(20, 20, 20);

        sun = new Light(world);
        sun.setIntensity(250, 250, 250);

        Texture texture = new Texture(BitmapHelper.rescale(BitmapHelper.convert(getResources().getDrawable(R.drawable.monster)), 512, 512));
        TextureManager.getInstance().addTexture("texture", texture);

        model = loadModel(R.raw.monster, 1);
        model.setTexture("texture");
        model.build();

        world.addObject(model);

        Camera cam = world.getCamera();
        cam.moveCamera(Camera.CAMERA_MOVEOUT, 50);
        cam.lookAt(model.getTransformedCenter());

        SimpleVector sv = new SimpleVector();
        sv.set(model.getTransformedCenter());
        sv.y -= 100;
        sv.z -= 100;
        sun.setPosition(sv);
        MemoryHelper.compact();

        if (master == null) {
          Logger.log("Saving master Activity!");
          master = MainActivity.this;
        }
      }


      if (surfaceTexture != null) {
        surfaceTexture.release();
      }

      textureRenderer.surfaceCreated();
      surfaceTexture = new SurfaceTexture(textureRenderer.getTextureId());
      externalTexture.setExternalId(textureRenderer.getTextureId(), GLES11Ext.GL_TEXTURE_EXTERNAL_OES);
      surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
        @Override
        public void onFrameAvailable(SurfaceTexture surfaceTexture) {
          synchronized (MainActivity.this) {
            frameAvailable = true;
          }
        }
      });

      MemoryHelper.compact();


      mCamera = android.hardware.Camera.open();
      try {
        mCamera.setPreviewTexture(surfaceTexture);
        mCamera.startPreview();
      } catch (IOException ioe) {
        // Something bad happened
      }
    }

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    }

    public void onDrawFrame(GL10 gl) {
      if (touchTurn != 0) {
        model.rotateY(touchTurn);
        touchTurn = 0;
      }

      if (touchTurnUp != 0) {
        model.rotateX(touchTurnUp);
        touchTurnUp = 0;
      }

      fb.clear();
      world.renderScene(fb);
      world.draw(fb);
      fb.display();

      synchronized (this) {
        if (frameAvailable) {
          int error = GLES20.glGetError();
          if (error != 0) {
            System.out.println("gl error before updateTexImage" + error + ": " + GLU.gluErrorString(error));
          }
          surfaceTexture.updateTexImage();
          frameAvailable = false;
        }
      }
    }

    private Object3D loadModel(int filename, float scale) {
      InputStream stream = getResources().openRawResource(filename);
      Object3D[] model = Loader.load3DS(stream, scale);
      Object3D o3d = new Object3D(0);
      Object3D temp = null;
      for (int i = 0; i < model.length; i++) {
        temp = model[i];
        System.out.println("model:" + temp.getName());
        temp.setCenter(SimpleVector.ORIGIN);
        temp.rotateX((float) (-.5 * Math.PI));
        temp.rotateMesh();
        temp.setRotationMatrix(new Matrix());
        o3d = Object3D.mergeObjects(o3d, temp);
        o3d.build();
      }
      return o3d;
    }
  }
}


EgonOlsen

Maybe...but your code is full of log outputs. It should actually print out if something goes wrong and what exactly it is.

EgonOlsen

And by the way: FrameBuffer has two getPixels() methods. Use these to read the screen's content. There's no need to use glReadPixels directly, because, as said, jPCT-AE manages its own state. Fiddling around with GL directly might screw this up.
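
For the recording requirement from the first post, that would look roughly like the sketch below, assuming the no-argument getPixels() overload returns the current frame as an int[] in a Bitmap-compatible order (check the FrameBuffer javadoc for the two variants and their exact semantics).

// Sketch: grab the rendered frame through jPCT-AE instead of glReadPixels.
private android.graphics.Bitmap grabFrame(FrameBuffer fb) {
  int[] pixels = fb.getPixels(); // assumed no-arg overload returning the current frame
  return android.graphics.Bitmap.createBitmap(pixels, fb.getWidth(), fb.getHeight(),
      android.graphics.Bitmap.Config.ARGB_8888);
}

Feeding those bitmaps into MediaRecorder/MediaCodec to produce the video file is a separate step.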

jaychang0917

#5
Quote from: EgonOlsen on February 10, 2017, 07:05:13 PM
Maybe...but your code is full of log outputs. It should actually print out if something goes wrong and what exactly it is.

I figured out that error; it was because I hadn't called setEGLContextClientVersion(2). But it still can't compile the shader; it throws:

02-11 12:20:09.527 2234-2277/com.example.dsmodel_ar E/AndroidRuntime: FATAL EXCEPTION: GLThread 19862
    Process: com.example.dsmodel_ar, PID: 2234
    java.lang.RuntimeException: glCreateShader type=35633: glError 1280
        at com.blogspot.geekonjava.dsmodel_ar.TextureRenderer.checkGlError(TextureRenderer.java:212)
        at com.blogspot.geekonjava.dsmodel_ar.TextureRenderer.loadShader(TextureRenderer.java:166)
        at com.blogspot.geekonjava.dsmodel_ar.TextureRenderer.createProgram(TextureRenderer.java:180)
        at com.blogspot.geekonjava.dsmodel_ar.TextureRenderer.surfaceCreated(TextureRenderer.java:115)
        at com.blogspot.geekonjava.dsmodel_ar.MainActivity$MyRenderer.onSurfaceChanged(MainActivity.java:202)
        at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1511)
        at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1239)

EgonOlsen

OK, but that has nothing to do with jPCT-AE. If your shader doesn't compile because of a syntax error, you should actually get a human-readable message about it. In your case, you are getting a 1280, which means invalid enum. Something in your setup code is wrong.
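
One common cause of that particular error is requesting the ES 2.0 context too late: setEGLContextClientVersion(2) has to be called before setRenderer(), otherwise the surface still gets an ES 1.x context and ES 2.0 calls such as glCreateShader fail, which would match the glError 1280 above. A minimal sketch of the setup order, using the view and renderer from the code above:

mGLView = (GLSurfaceView) findViewById(R.id.surfaceView);
mGLView.setEGLContextClientVersion(2); // request an ES 2.0 context; must come before setRenderer()
mGLView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
mGLView.getHolder().setFormat(PixelFormat.TRANSLUCENT);

renderer = new MyRenderer();
mGLView.setRenderer(renderer);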

jaychang0917

#7
I managed to render the Android camera preview texture to the GLSurfaceView now. However, only the camera content or the 3d model can be shown, not both. The following code results in only the 3d model being shown, but what I want is the 3d model shown on top of the camera preview.

@Override
  public synchronized void onDrawFrame(GL10 gl) {
    GLES20.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

    //render the camera texture
    if (updateTexture) {
      mSurfaceTexture.updateTexImage();
      mSurfaceTexture.getTransformMatrix(mTransformM);

      updateTexture = false;

      GLES20.glViewport(0, 0, mWidth, mHeight);

      mOffscreenShader.useProgram();

      int uTransformMLoc = mOffscreenShader.getHandle("uTransformM");
      int uOrientationMLoc = mOffscreenShader.getHandle("uOrientationM");
      int uRatioVLoc = mOffscreenShader.getHandle("ratios");

      GLES20.glUniformMatrix4fv(uTransformMLoc, 1, false, mTransformM, 0);
      GLES20.glUniformMatrix4fv(uOrientationMLoc, 1, false, mOrientationM, 0);
      GLES20.glUniform2fv(uRatioVLoc, 1, mRatio, 0);

      GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
      GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mCameraTexture.getTextureId());

      renderQuad(mOffscreenShader.getHandle("aPosition"));
    }

    //render 3d model
    frameBuffer.clear();
    world.renderScene(frameBuffer);
    world.draw(frameBuffer);
    frameBuffer.display();
  }


Sorry for my late reply. EgonOlsen, can you help take a look?
The full sample project: https://www.dropbox.com/s/bzgnmholz9z030i/CameraView1.zip?dl=0

jaychang0917

#8
It seems the model is covered by the camera preview. If I reduce the camera preview area, the 3d model is shown (see attachment). I have no idea what is going wrong.

EgonOlsen

You are clearing the buffer after rendering the quad (again: why are you still mixing jPCT-AE and GL calls? That's asking for all kinds of trouble. You don't have to do that, and you are not supposed to do it either... anyway...).
Try to replace


frameBuffer.clear();


with


frameBuffer.clearZBufferOnly();
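
Applied to the onDrawFrame() above, the tail of the frame would then read roughly like this:

// Keep the color buffer (the camera quad just drawn) and only reset the depth buffer,
// so jPCT-AE draws the model on top of the preview instead of over a freshly cleared screen.
frameBuffer.clearZBufferOnly(); // was: frameBuffer.clear()
world.renderScene(frameBuffer);
world.draw(frameBuffer);
frameBuffer.display();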

jaychang0917

#10
Quote: Why are you still mixing jPCT-AE and GL calls?
It's because this is the only way I found to render the camera texture to the GLSurfaceView. Is there a method like "addStreamingBackground()" in jPCT-AE?

Quote: clearZBufferOnly
This works!!

jaychang0917

I found that the fps drops from 60 to 30 when both the camera texture and the 3d model are rendered. Is there any improvement I can make?

EgonOlsen

30 is the next step down from 60, because all Android devices use vertical synchronization. That means: if your app can output 80 fps, it will display 60 fps; if it can only output 59 fps, it will output 30 fps. There's nothing you can do about that.

Which device are we talking about here?

EgonOlsen

Quote from: jaychang0917 on February 13, 2017, 10:23:29 AM
It's because this is the only way I found to render the camera texture to the GLSurfaceView. Is there a method like "addStreamingBackground()" in jPCT-AE?
No, but you could set up your external texture (in that case, you need some GL calls at setup time), assign it to a jPCT-AE managed texture (as mentioned earlier), and blit it to the background in each frame.

Anyway, as long as your code works on all devices, you might want to stick with it for now. Just don't expect me to offer support for the parts that are native GL calls or for any side-effects that may occur.
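
A rough sketch of that per-frame flow, assuming the external camera texture has already been wired to a jPCT-AE Texture via setExternalId() as described earlier; cameraTexture, surfaceTexture, frameAvailable, fb and world are assumed fields, and the exact blit() overload and its arguments should be checked against the FrameBuffer javadoc.

public void onDrawFrame(GL10 gl) {
  synchronized (this) {
    if (frameAvailable) {
      surfaceTexture.updateTexImage(); // pull the latest camera frame into the OES texture
      frameAvailable = false;
    }
  }

  fb.clear();
  // Stretch the camera texture over the whole frame buffer as the background.
  // Source size and the -1 (no transparency) value are assumptions for this sketch.
  fb.blit(cameraTexture, 0, 0, 0, 0, 32, 32, fb.getWidth(), fb.getHeight(), -1, false, null);

  world.renderScene(fb);
  world.draw(fb);
  fb.display();
}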

jaychang0917

QuoteWhich device are we talking about here?
LG-D858