One World, Two OpenGL Contexts

Started by nathan, April 10, 2013, 11:23:28 PM

Previous topic - Next topic

nathan

I am trying to render the same scene onto two displays, using Android's Presentation mode.  Because of Presentation mode, I am required to have two OpenGL threads and contexts.

I, of course, do not want to have two Worlds and two of every object.  So I created one World and had both scenes render it.  Since this would otherwise cause threading issues, I synchronized the calls in a method called doRender().  Only one thread is ever reading the World data at a time, and I have confirmed this with logs.
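The serialization I'm using looks roughly like this (a minimal, self-contained sketch with hypothetical names; in the real code the body of doRender() would call the jPCT renderScene()/draw() methods on the shared World instead of the counters shown here):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class SharedWorldRender {
    private final Object renderLock = new Object();
    private int active = 0;        // threads currently inside doRender()
    private int maxActive = 0;     // highest concurrency ever observed
    private final AtomicInteger frames = new AtomicInteger();

    public void doRender() {
        synchronized (renderLock) {
            active++;
            maxActive = Math.max(maxActive, active);
            frames.incrementAndGet(); // stand-in for world.renderScene()/world.draw()
            active--;
        }
    }

    public int getMaxActive() { return maxActive; }
    public int getFrames()    { return frames.get(); }

    public static void main(String[] args) throws InterruptedException {
        SharedWorldRender r = new SharedWorldRender();
        Runnable loop = () -> { for (int i = 0; i < 1000; i++) r.doRender(); };
        Thread a = new Thread(loop), b = new Thread(loop);
        a.start(); b.start();
        a.join(); b.join();
        System.out.println(r.getMaxActive()); // prints 1 - never two readers at once
        System.out.println(r.getFrames());    // prints 2000
    }
}
```

Both GL threads call doRender(), and because they contend for the same lock, maxActive never exceeds 1 - which is exactly what my logs show in the real app.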

This seems to run just fine.  But every time my doRender() function is called, I get this at the beginning:
I/jPCT-AE ( 1784): OpenGL context has changed(2)...recovering for renderer 1/0!
I/jPCT-AE ( 1784): Creating buffers...

I believe this is because the World knows that it is being asked to render to a different context, and so it loads different texture buffers, etc.  Is that right?  I know this is not ideal, but it may be necessary.

This runs for a few minutes, but then it crashes with:
I/jPCT-AE ( 1976): OpenGL context has changed(2)...recovering for renderer 0/1!
E/SurfaceTextureClient( 1976): queueBuffer: error queuing buffer to SurfaceTexture, -12
I/jPCT-AE ( 1976): Creating buffers...
W/GLThread( 1976): eglSwapBuffers failed: EGL_BAD_NATIVE_WINDOW
F/libc    ( 1976): Fatal signal 11 (SIGSEGV) at 0x00000000 (code=1), thread 1990 (Thread-145)

followed by a bunch of debug information.
This seems to happen around the 12750th call to doRender().

Why is this crashing?  Is it allocating too much memory every time the buffers are created?

Is there a way to tell JPCT that it will constantly be switching between two GL contexts?

nathan

Using Android's top program:
$ top
  PID PR CPU% S  #THR     VSS     RSS PCY UID      Name
1182  1   1% S    12 765720K 316980K  fg u0_a55   <my application>

I see that the memory allocated to my app constantly increases, and I am not doing anything else in this app that would allocate memory.

EgonOlsen

You can't do it this way. A World has objects, and these objects have native resources allocated for rendering which are bound to a context. Rendering the same World into different contexts causes these buffers to be recreated each frame, which evidently causes a memory leak somewhere (I'm not sure exactly why, but it doesn't really matter, because this approach won't work well anyway).
You might want to try setting Config.useVBO to false and see if that helps. That will decouple the objects from native resources.
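If you go that route, the setting is a static flag on the Config class; a sketch of where it would go (assuming it is set once, before the first frame is rendered, e.g. in your Activity's onCreate()):

```java
// Disable vertex buffer objects so mesh data is uploaded each frame
// instead of being cached in context-bound buffers.
Config.useVBO = false;
```

Note this trades the leak for per-frame upload cost, so expect lower throughput.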