[Tips] Android, augmented reality 3D with JPCT + Camera.

Started by dl.zerocool, May 11, 2010, 08:06:35 PM


raft

I wasn't aware Vuforia is free for commercial apps :)

Quote
Which bit can't you use (commercial reasons or technical?)
both, I suppose. Our pipeline depends on OpenCV-based detection. OpenCV's Android library renders onto a Canvas (i.e. a SurfaceView, not a GLSurfaceView), and here I'm trying to merge them into one ;)

AugTech

Wasn't aware OpenCV had an Android implementation.. Have to look that up when I have the chance.

Can't help much with the canvas, but with Vuforia you could always nobble the part that actually checks for Trackables and just utilise the camera preview parts, although that's possibly a bit heavy-handed.


JNIEXPORT void JNICALL
Java_com_augtech_awilaSDK_graphics_JpctRenderer_renderFrame(JNIEnv* env, jobject obj) {

    jclass activityClass = env->GetObjectClass(obj); // We get the class of our graphics engine

    // Clear color and depth buffer
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Get the state from QCAR and mark the beginning of a rendering section
    QCAR::State state = QCAR::Renderer::getInstance().begin();

    // Explicitly render the Video Background
    QCAR::Renderer::getInstance().drawVideoBackground();

    // ... Trackable handling would normally go here; skip it to use only the preview ...

    // Mark the end of the rendering section
    QCAR::Renderer::getInstance().end();
}

mxar

Hi,

I'm working on an Augmented Reality project using jPCT-AE as the rendering engine.

I need to dynamically load a 3D object onto the camera surface from a data buffer (in memory), not from the raw or assets directory.

Is this possible?

Thanks in advance

raft

sure, jPCT's Loader class works on InputStreams; it doesn't care about the source of the stream.
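For example, a model downloaded into a byte array can be handed to the loader by wrapping it in a ByteArrayInputStream. A minimal sketch, the class and method names are mine, and the Loader.load3DS call is shown only in a comment because it needs jPCT-AE on the classpath:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class BufferLoading {

    // Wrap a model that already sits in memory in an InputStream;
    // to jPCT's Loader, an in-memory buffer is as good a source as a file or asset.
    public static InputStream toStream(byte[] modelData) {
        return new ByteArrayInputStream(modelData);
    }

    public static void main(String[] args) throws Exception {
        byte[] downloaded = {0x4D, 0x4D, 0x00, 0x00}; // stand-in for real model data
        InputStream in = toStream(downloaded);

        // With jPCT-AE available, the stream goes straight to the loader, e.g.:
        // Object3D[] parts = Loader.load3DS(in, 1.0f);

        System.out.println(in.available()); // the 4 buffered bytes
    }
}
```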

mxar


Thanks for your answer, but what about the texture images?

I don't want to load them from the raw or assets folder.

I want to load them from memory (a data buffer).


Thanks in advance.


raft

yes, it's also possible. I've used the camera's and the media player's output as the texture of a 3D object by using GL_TEXTURE_EXTERNAL_OES.

have a look at this thread:
http://www.jpct.net/forum2/index.php/topic,3794.0.html

mxar



In the project I have to dynamically show 3D objects on the camera surface.

The renderer engine is notified by the AR engine that a new 3D object is ready to be displayed on the camera's surface.
The 3D object is stored dynamically by the AR engine in a data buffer.

So the renderer engine reads the content from the data buffer, parses it, and must then show the 3D object on the camera's surface.

Do you think this will work with jPCT-AE?

Do I need to know the type of 3D object (.3ds, .obj, ...) before parsing the contents of the data buffer?

Do I need to specify the location of the texture images?


Thanks in Advance

raft

if you mean rendering into Android's camera view, I guess that's not possible. some people (for example, as in the beginning of this thread) put the camera view and the GLSurfaceView (the view jPCT renders into) on top of each other. IMHO, this is not the correct way. I take the camera data and blit it into the GLSurfaceView as a background, then draw any 3D objects on top of that as I like.

somewhere in the forum you can find more information about that


mxar


Actually, the 3D objects are rendered on the GLSurfaceView surface.

I use two surfaces, the camera surface and the GLSurfaceView surface, one on top of the other.

So what do you think: is it possible to load 3D objects dynamically onto the GLSurfaceView?

These 3D objects are stored on a remote server, and under some conditions the AR engine downloads them.

As I described before, the AR engine then fills a data buffer with the content of the downloaded 3D object and notifies the renderer engine.

The renderer engine then reads the contents of the data buffer and draws the 3D object on the GLSurfaceView.

Do I need to know the specific type of 3D object (.3ds, .obj, ...) before reading and parsing the contents of the data buffer?

What about the texture images? They are no longer stored in the assets or raw folder.

Thanks in advance.

raft

yes, you can do that. as I said, jPCT works on streams; it doesn't care about the source of the stream. as for the type of the object, maybe the AR engine can put another bit of data in front describing its type. inspecting the first bytes of the stream may also help
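Sniffing the first bytes is workable because the common formats have recognisable signatures: a .3ds file starts with the main chunk ID 0x4D4D (little-endian, so the first two bytes are "MM"), an .md2 file starts with the magic "IDP2", and .obj is plain ASCII text. A rough sketch; the method name and the ASCII heuristic are mine, not part of any jPCT API:

```java
public class ModelTypeSniffer {

    // Guess the model format from the first bytes of a downloaded buffer.
    public static String sniff(byte[] data) {
        // .3ds: main chunk ID 0x4D4D, stored little-endian
        if (data.length >= 2 && data[0] == 0x4D && data[1] == 0x4D) {
            return "3ds";
        }
        // .md2: magic number "IDP2"
        if (data.length >= 4 && data[0] == 'I' && data[1] == 'D'
                && data[2] == 'P' && data[3] == '2') {
            return "md2";
        }
        // .obj is a text format: if the first bytes are printable ASCII,
        // treat the buffer as OBJ (a crude but usually sufficient check)
        int n = Math.min(data.length, 64);
        for (int i = 0; i < n; i++) {
            int b = data[i] & 0xFF;
            if (b != '\t' && b != '\n' && b != '\r' && (b < 0x20 || b > 0x7E)) {
                return "unknown";
            }
        }
        return "obj";
    }
}
```

Of course, having the AR engine prepend an explicit type tag to the buffer, as raft suggests, is more robust than sniffing.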

mxar

Thanks for the help.
I'm sure your answer will be useful for completing the project.


mxar

Hi,

I want to blit a captured camera image into a FrameBuffer object in the onDraw() method.
On top of this image I want to display some 3D objects. It is an Augmented Reality application.

What is the best (and fastest) way to blit a captured camera image into the FrameBuffer?

Thanks in advance
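One common approach, the one raft describes earlier in the thread, is to grab the NV21 preview frame from the camera callback, convert it to ARGB pixels, wrap those in a Texture, and blit that texture into the FrameBuffer each frame. The conversion is the expensive part; below is a sketch of the standard integer YUV-to-RGB decode (class and method names are mine):

```java
public class CameraBlit {

    // Convert an NV21 preview frame (Android's default camera preview format)
    // into ARGB pixels that can be wrapped in a jPCT Texture and blitted.
    public static int[] nv21ToArgb(byte[] yuv, int width, int height) {
        int[] argb = new int[width * height];
        int frameSize = width * height;
        for (int j = 0, yp = 0; j < height; j++) {
            // the interleaved VU plane follows the Y plane, one row per two Y rows
            int uvp = frameSize + (j >> 1) * width;
            int u = 0, v = 0;
            for (int i = 0; i < width; i++, yp++) {
                int y = (0xFF & yuv[yp]) - 16;
                if (y < 0) y = 0;
                if ((i & 1) == 0) { // each VU pair covers two pixels
                    v = (0xFF & yuv[uvp++]) - 128;
                    u = (0xFF & yuv[uvp++]) - 128;
                }
                // fixed-point YUV -> RGB (values scaled by 1024)
                int y1192 = 1192 * y;
                int r = y1192 + 1634 * v;
                int g = y1192 - 833 * v - 400 * u;
                int b = y1192 + 2066 * u;
                r = Math.min(Math.max(r, 0), 262143);
                g = Math.min(Math.max(g, 0), 262143);
                b = Math.min(Math.max(b, 0), 262143);
                argb[yp] = 0xFF000000
                        | ((r << 6) & 0xFF0000)
                        | ((g >> 2) & 0xFF00)
                        | ((b >> 10) & 0xFF);
            }
        }
        return argb;
    }
}
```

On the jPCT-AE side the int[] would typically be copied into a Bitmap (Bitmap.setPixels()), turned into a Texture, and drawn with something like fb.blit(camTexture, ...) before the 3D world renders. Note that re-uploading a full-resolution frame every frame is costly, so a small preview size helps a lot.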