Why does jPCT-AE need to keep the texture in the VM memory?

Started by AeroShark333, December 01, 2016, 04:59:09 AM


AeroShark333

Hello,

Well I guess the title explains all...
I was wondering: when textures get uploaded to the GPU, why does jPCT-AE still need the texture data in VM memory (even when the pixel data is kept so that it should be safe for any context change)?

Is it possible to not have the textures in VM memory but uploaded to the GPU only?

Cheers,
AeroShark333

EgonOlsen

It needs it for managing re-uploads after a context change. You can change that behaviour with the keepPixels() methods in Texture, but then you have to reinitialize everything on your own in case of a context change. I don't think it's worth it...
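A rough sketch of what that could look like (the exact name of the keepPixels-style method is written from memory here, so check the Texture docs; the helper and "rock.png" are just examples):

import java.io.IOException;
import android.content.res.AssetManager;
import com.threed.jpct.Texture;
import com.threed.jpct.TextureManager;

// Example helper: load a texture and tell jPCT-AE not to keep the VM copy.
// Without that copy, jPCT-AE can't re-upload the texture by itself after a
// context change anymore - you would have to reload and re-add it yourself.
static void addTextureWithoutVMCopy(AssetManager assets, String name) throws IOException {
    Texture tex = new Texture(assets.open(name + ".png")); // e.g. "rock.png"
    tex.keepPixelData(false); // assumed name of the keepPixels-style switch
    TextureManager.getInstance().addTexture(name, tex);
}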

AeroShark333

Hmmm, well (re-)uploads seem like the most time-consuming task, especially with bigger textures I suppose...?
I don't know if jPCT still re-uploads Textures that are actually still in the GPU with keepPixels enabled... (seems pretty useless to me? I don't know)
Hmhm, why wouldn't it be worth it? I don't really completely understand it I guess.

EgonOlsen

I'm not sure if I got the question correctly. It uploads textures when it has to, no more and no less. It has to do it on a context change, which basically means every time you create a new FrameBuffer instance. You can minimize the creation of new FrameBuffers, and thus the context changes, by adding this to your GL init code:


mGLView.setPreserveEGLContextOnPause(true);


and then in onSurfaceChanged() check if the given GL10 instance equals the last one. If it does, you can reuse the FrameBuffer. If it doesn't, you have to create a new one. It doesn't matter if you are actually using that instance (i.e. whether you are using GLES 1.x or GLES 2.0), the test is still valid.
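A minimal sketch of that check, assuming a GLES 2.0 renderer (the class and field names are just for illustration):

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.opengl.GLSurfaceView;
import com.threed.jpct.FrameBuffer;

class MyRenderer implements GLSurfaceView.Renderer {
    private GL10 lastGL = null;
    private FrameBuffer fb = null;

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    }

    public void onSurfaceChanged(GL10 gl, int w, int h) {
        // Same GL10 instance and same size as last time -> the EGL context
        // survived, so the FrameBuffer (and all uploaded textures) can be reused.
        if (gl == lastGL && fb != null && fb.getWidth() == w && fb.getHeight() == h) {
            return;
        }
        if (fb != null) {
            fb.dispose();
        }
        fb = new FrameBuffer(w, h); // GLES 2.0 constructor; use new FrameBuffer(gl, w, h) for GLES 1.x
        lastGL = gl;
    }

    public void onDrawFrame(GL10 gl) {
        // render the scene with fb here
    }
}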

That applies to normal apps. For wallpapers (just in case we are talking about those here), I don't know if that's applicable.

AeroShark333

I guess I understood that, but aren't there scenarios in which textures don't have to be re-uploaded? (So the textures wouldn't need to stay in VM memory any more, right?)
So isn't it possible to clear the texture data in VM memory? Setting the texture data references to null, I suppose, so the GC can pick them up...

EgonOlsen

As mentioned, you can use the keepPixels methods to discard the VM copy. But once the context changes, you are screwed and all your textures will turn white (in the best case).

AeroShark333

Does jPCT automatically 'discard' these VM copies? Or do I have to do this myself?
If so, how?

EgonOlsen

No, it keeps them. Unless you tell it not to by using these keepPixels() methods. But honestly... Why do you want to do all this? It's usually asking for more trouble than it's worth.

AeroShark333

Well I'm using high resolution textures and VM memory is pretty limited on Android devices...

EgonOlsen

You could use the Virtualizer class, if you really want to.

AeroShark333

Hmmm, so I did some testing... And I wondered: why can't jPCT use the textures that were already uploaded to the GPU when the context has changed? After all... the 2D texture images themselves don't change, right?

EgonOlsen

Because it just doesn't work. Once the context is lost, GPU memory is empty. It's been reset.

AeroShark333

Using the Virtualizer class would significantly slow down loading times, right?
Especially with high-resolution textures...?

Sooo, I tried this with my live wallpaper:
gLView.setPreserveEGLContextOnPause(true);
And it seems to work fine I guess; even when not keeping textures in VM.

When the context changes, it seems that the GLSurfaceView and the Renderer get disposed and recreated.
Textures will be re-uploaded, so that's fine I guess.
Context changes aren't supposed to happen anyway, I think, but they do seem to happen in the live wallpaper picker for some reason when changing the orientation...
Context changes do not seem to happen, though, when the wallpaper is applied in a certain launcher and the orientation changes.

So now I basically only load textures when the live wallpaper is launched (and on context changes).
This loading process is a one-time thing, but it does seem to take up some time...
Also, I noticed that when I'm adding textures to the TextureManager, the VM copy is still kept until the texture is actually used on a visible Object3D in a World.draw() call.

Is it somehow possible to upload textures to the GPU immediately when the texture is added to the TextureManager? (Except the textures that are only being blitted, since I guess they don't need to be uploaded to the GPU...)
So I thought this would be possible by using a World.draw() call after an Object3D with a texture is added to the world.
But then it also needs to be visible in order to get the texture uploaded to the GPU; so is there some way to make all objects in my world 'visible'?
Sooner or later all my objects are visible anyway...

So how it now works:
[add Texture to texturemanager -> apply Texture to Object3D -> add Object3D to world -> world.draw() -> rendering starts -> after a few seconds there is a hiccup because the added Object3D has become visible -> uploading texture to GPU -> continue rendering]
And what I want:
[add Texture to TextureManager (as VM copy) -> apply texture to Object3D -> add Object3D to world -> upload texture to GPU -> rendering starts]
I think I could possibly achieve this by pointing the camera at the Object3D so it becomes visible (see the sketch below):
[add Texture to TextureManager (as VM copy) -> apply texture to Object3D -> add Object3D to world -> make Camera face the Object3D -> upload texture to GPU -> rendering starts]
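Something like this is what I have in mind (all names are placeholders; just a sketch of the camera trick, not tested):

import com.threed.jpct.FrameBuffer;
import com.threed.jpct.Object3D;
import com.threed.jpct.World;

// Point the camera at the object and run one render pass so that the
// object becomes visible and its texture gets uploaded to the GPU right away.
static void forceTextureUpload(World world, Object3D obj, FrameBuffer fb) {
    world.getCamera().lookAt(obj.getTransformedCenter());
    world.renderScene(fb);
    world.draw(fb); // the texture should be uploaded during this call
}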

And since I guess I never actually really need a VM copy of these textures anyway (but only for the initial uploading), would it be possible to upload textures to the GPU immediately from a given InputStream (of a .png file)? (I suppose this would save VM memory...)

So what I originally wanted:
[add Texture to TextureManager (as VM copy) -> apply texture to Object3D -> add Object3D to world -> upload texture to GPU -> rendering starts]
What I would want now:
[apply Texture PNG-InputStream to Object3D (no VM copy of the Texture) -> add Object3D to world -> upload texture to GPU using the PNG-InputStream -> rendering starts] (or perhaps a Bitmap byte-InputStream if that works better...)
Basically what I want to achieve here is that I don't want a Texture to take up VM memory if the VM copy of the Texture gets disposed after uploading anyway...


AeroShark333

Euh, so instantly uploading textures to the GPU wouldn't be possible? With a ByteBuffer InputStream from a Bitmap image then, maybe?

I tried TextureManager.preWarm(fB); but it didn't seem to upload all the Textures that I needed... Some were uploaded only when the first World.draw(fB); call was made.
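For reference, what I tried was roughly this (asset names and the helper are just placeholders; streams aren't closed here to keep the sketch short):

import java.io.IOException;
import android.content.res.AssetManager;
import com.threed.jpct.FrameBuffer;
import com.threed.jpct.Texture;
import com.threed.jpct.TextureManager;

// Add all textures first, then try to push them to the GPU before the first draw.
static void initTextures(AssetManager assets, FrameBuffer fb) throws IOException {
    TextureManager tm = TextureManager.getInstance();
    tm.addTexture("sky", new Texture(assets.open("sky.png")));
    tm.addTexture("rock", new Texture(assets.open("rock.png")));
    // Expected this to upload everything right away, but some textures still
    // got uploaded only during the first World.draw(fb) call.
    tm.preWarm(fb);
}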