[Tips] Android, augmented reality 3D with JPCT + Camera.

Started by dl.zerocool, May 11, 2010, 08:06:35 PM


EgonOlsen

The AAConfigChooser doesn't specify an alpha value for the config, which means that it defaults to 0. I can change this and give you an updated version to try later.
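For reference, the change boils down to asking EGL for an alpha channel when choosing the config. This is only a rough sketch of the idea (not the actual AAConfigChooser source, and without the anti-aliasing handling):

import javax.microedition.khronos.egl.EGL10;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.egl.EGLDisplay;
import android.opengl.GLSurfaceView;

// Sketch only: a config chooser that explicitly requests an 8 bit alpha channel.
public class AlphaConfigChooser implements GLSurfaceView.EGLConfigChooser {

    public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display) {
        int[] attribs = {
            EGL10.EGL_RED_SIZE, 8,
            EGL10.EGL_GREEN_SIZE, 8,
            EGL10.EGL_BLUE_SIZE, 8,
            EGL10.EGL_ALPHA_SIZE, 8,      // without this, alpha defaults to 0
            EGL10.EGL_DEPTH_SIZE, 16,
            EGL10.EGL_RENDERABLE_TYPE, 4, // EGL_OPENGL_ES2_BIT
            EGL10.EGL_NONE
        };
        // Query the number of matching configs first, then fetch them
        int[] num = new int[1];
        egl.eglChooseConfig(display, attribs, null, 0, num);
        EGLConfig[] configs = new EGLConfig[num[0]];
        egl.eglChooseConfig(display, attribs, configs, num[0], num);
        return num[0] > 0 ? configs[0] : null;
    }
}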

EgonOlsen

Please try this jar: http://www.jpct.de/download/beta/jpct_ae.jar. It has an additional constructor that takes a boolean to enable/disable alpha. I haven't tested it; I'm just setting the configuration and hoping for the best...

AugTech

Brilliant - that's done it!

The final setup within the SurfaceView is...

setEGLContextClientVersion(2);
setEGLConfigChooser(new AAConfigChooser(this, true));
getHolder().setFormat(PixelFormat.TRANSLUCENT);
//setEGLConfigChooser(8, 8, 8, 8, 16, 8);//Tablet compatible (RETEST)

// Setup the renderer
String inC = Globals.App_Prefs.getString("BackgroundColour", "0,0,0,0");
RGBColor background = Utilities.decodeColour(inC);
mRenderer = new JpctRenderer(background);
setRenderer(mRenderer);
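(Utilities.decodeColour is just a small helper of mine that turns the stored "r,g,b,a" string into a jPCT RGBColor - roughly along these lines:)

// Rough sketch of the decodeColour helper - splits a comma-separated
// "r,g,b,a" string (e.g. "0,0,0,0") into a jPCT RGBColor with alpha.
public static RGBColor decodeColour(String value) {
    String[] parts = value.split(",");
    return new RGBColor(
            Integer.parseInt(parts[0].trim()),
            Integer.parseInt(parts[1].trim()),
            Integer.parseInt(parts[2].trim()),
            Integer.parseInt(parts[3].trim()));
}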


All I have to do now is align the objects to the world without billboarding!

Cheers.



AugTech

Egon,
Unfortunately some test users (myself included) are now getting an IndexOutOfBoundsException inside AAConfigChooser.
This code

setEGLContextClientVersion(2);
if (Main.isTablet()) {
    setEGLConfigChooser(8, 8, 8, 8, 16, 8);
} else {
    setEGLConfigChooser(new AAConfigChooser(this, true));
}
getHolder().setFormat(PixelFormat.TRANSLUCENT);


has been working fine for months, but running it now on lesser devices than my Galaxy SII results in the following stack trace on start-up:

FATAL EXCEPTION: GLThread 9
java.lang.ArrayIndexOutOfBoundsException
at com.threed.jpct.util.AAConfigChooser.chooseConfig(AAConfigChooser.java:131)
at android.opengl.GLSurfaceView$EglHelper.start(GLSurfaceView.java:918)
at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1248)
at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1118)


I must add that I have had to download a more recent beta of jPCT-AE as my dev environment died and I had to re-acquire several libraries - I don't know if this would make a difference...

Any thoughts/tips would be great.

EgonOlsen

Seems like a flaw... when no matching config is found, it should choose the first one, but it chooses the -1th one instead, which fails. This version should fix it: http://jpct.de/download/beta/jpct_ae.jar. However, I've no idea what this first config will actually be and whether it fits...
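Simplified, the gist of it is something like this (isSuitable() just stands in for the real matching logic; this is not the literal source):

// Simplified gist of the bug (not the literal AAConfigChooser source)
int match = -1;
for (int i = 0; i < configs.length; i++) {
    if (isSuitable(configs[i])) {
        match = i;
        break;
    }
}
// With no match, configs[-1] throws the ArrayIndexOutOfBoundsException;
// the fix falls back to the first returned config instead:
return configs[match >= 0 ? match : 0];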

AugTech

Thanks Egon, that's done the trick.

As an aside, you wouldn't happen to have a good bit of code for working out the actual available memory for the running application? (Just wondering whether jPCT already does anything in terms of memory checking.)
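(For context, the kind of check I mean is roughly the standard Runtime / ActivityManager calls below - just wondering if jPCT offers anything beyond that. The context variable is whatever Context you have to hand.)

// Rough sketch using the standard Android/Java calls (nothing jPCT-specific)
Runtime rt = Runtime.getRuntime();
long heapLimit = rt.maxMemory();                     // hard heap limit for this VM, in bytes
long heapUsed  = rt.totalMemory() - rt.freeMemory(); // currently allocated and in use
long headroom  = heapLimit - heapUsed;               // roughly what is still available

// The per-app heap size class the device grants, in MB
ActivityManager am = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
int memoryClassMb = am.getMemoryClass();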

Thanks again,

Mike

GLeonio

Thanks for the helpful tips! I compiled a list of some of the top resources I found on 3D Android augmented reality, including this post. Check it out - I hope it's useful to other developers here. :) http://www.verious.com/board/Giancarlo-Leonio/3d-android-augmented-reality/

raft

Quote from: dl.zerocool on May 11, 2010, 08:06:35 PM
First we need to set up an XML layout.
Our minimum requirement is a GLSurfaceView, which is where we will draw the 3D (jPCT engine), and a SurfaceView to draw the camera preview.

<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent">

    <android.opengl.GLSurfaceView android:id="@+id/glsurfaceview"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent" />

    <SurfaceView android:id="@+id/surface_camera"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:layout_centerInParent="true"
        android:keepScreenOn="true" />
</FrameLayout>



I'm experimenting with this idea - very useful post :)

What I don't get is: in this layout the camera view should be on top of the GL view, not the other way around, but it's not ??? Furthermore, if I switch the order, the GL view disappears ::)

What am I missing?

AugTech

Unfortunately I can't answer your question directly (as to why), but I struggled for some time to get a stable implementation where the AR data was always on top of the camera, especially while toggling the camera preview on and off.
I always implemented this in code rather than a layout, as follows:

In the main activity's onCreate method, in this order (sketched below):

Create the CameraPreview.
Create the SurfaceView (with renderer)
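
Roughly (constructor arguments simplified; ArGLSurfaceView is just a placeholder name for the GLSurfaceView subclass that owns the JpctRenderer):

// Rough sketch of the onCreate ordering
private CameraPreview camScreen;
private ArGLSurfaceView augScreen;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    camScreen = new CameraPreview(this);   // sets its RelativeLayout as the content view
    augScreen = new ArGLSurfaceView(this); // adds itself on top via addContentView()
}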

In the CameraPreview constructor:

RelativeLayout tmpLayout = new RelativeLayout(appContext);
tmpLayout.addView(this, 0, new LayoutParams(
        LayoutParams.FILL_PARENT,
        LayoutParams.FILL_PARENT));
activity.setContentView(tmpLayout);


In the SurfaceView constructor:

setEGLContextClientVersion(2);
setEGLConfigChooser(new AAConfigChooser(this, true));
getHolder().setFormat(PixelFormat.TRANSLUCENT);

// Setup the renderer
mRenderer = new JpctRenderer(mainARView.orientationListener.tabletMode);
Activity activity = (Activity) mainARView.appContext;

// Add this view to the main activity
setZOrderMediaOverlay(true);
activity.addContentView(this, new LayoutParams(
        LayoutParams.FILL_PARENT,
        LayoutParams.FILL_PARENT));


If you want a UI on top of both, do this in the main activity after creating the SurfaceView:

uiScreen = new UserInterfaceView(this);
addContentView(uiScreen, fillLayout);
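
(fillLayout being just a full-screen LayoutParams, roughly:)

// fillLayout is assumed to be a simple full-screen LayoutParams
LayoutParams fillLayout = new LayoutParams(
        LayoutParams.FILL_PARENT, LayoutParams.FILL_PARENT);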


In onResume:

camScreen.start();
augScreen.setVisibility(View.VISIBLE);
augScreen.onResume();


In onPause:

augScreen.setVisibility(View.INVISIBLE);
augScreen.onPause();
camScreen.stop();


Hope this helps.

M

raft

Thanks :) I guess making the layout either from XML or in code ends up with the same thing.

I believe the right way of doing this is to somehow make the camera render to the GLSurfaceView. Has anyone done this?

AugTech

You can do this using the Vuforia SDK - that provides the image recognition and camera rendering via one renderFrame() call, which goes in the onDrawFrame method along with jPCT.
There's a pretty good article on the Wiki which I followed (mostly) to do just that.

raft

Yes, I saw that wiki page. Actually, the idea of rendering both into the same surface came from there.

But I cannot use it. Is it open source? Seeing the rendering code may help.

AugTech

QCAR/Vuforia is not open source, but it is free to use in private and commercial apps: https://developer.vuforia.com/legal/license/2-8

Which bit can't you use - for commercial or technical reasons?

For reference, the rendering part of my onDrawFrame method looks like this:

frameBuffer.clear(background);

/* Call our native QCAR function to render the video
* background and scan images for recognition */
renderFrame();

// Render the AR data to the jPCT scene
theWorld.renderScene(frameBuffer);
theWorld.draw(frameBuffer);

// Display our content...
frameBuffer.display();
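
For reference, on the Java side renderFrame() is only a native declaration that resolves into the QCAR/Vuforia native code - roughly like this, with the library name being whatever your native module is called:

// Load the native library that implements renderFrame(); the name here is a placeholder
static {
    System.loadLibrary("ImageTargets");
}

// Renders the camera background and runs the image recognition in native code (JNI)
public native void renderFrame();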