Author Topic: Could i hold the textures on GPU when the surface size changed?

Offline kiffa

  • long
  • ***
  • Posts: 199
    • View Profile
Could i hold the textures on GPU when the surface size changed?
« on: January 11, 2013, 06:59:52 am »
In my app, the window size can change at runtime, and I wrote the code below to handle this:

Code:

  // in class which implements Renderer
  public void onSurfaceChanged(GL10 gl, int width, int height)  {
    Log.d("jPCT-AE", "3d Renderer Surface Changed: " + width + ", " + height);
    Log.d("jPCT-AE", "3d Renderer Surface Changed: update frameBuffer");

    mFrameBuffer.dispose();
    initFrameBuffer(gl, width, height);
    getGameContext().setFrameBuffer(mFrameBuffer);
    initFpsShown();
   
    mGameLogic.onResize(mFrameBuffer.getWidth(), mFrameBuffer.getHeight());
  }

The call mFrameBuffer.dispose(); unloads all the textures from the GPU. There are two problems:

1. The unloading and reloading causes a rendering delay.

2. I need to hold all the textures in VM memory (Texture.defaultToKeepPixels(true);) or use the Virtualizer (which causes even more delay because of I/O, and doesn't help with the memory peak).

Could I rebuild the other context state but keep the textures on the GPU when the surface size changes?


Offline EgonOlsen

  • Administrator
  • quad
  • *****
  • Posts: 12295
    • View Profile
    • http://www.jpct.net
Re: Could i hold the textures on GPU when the surface size changed?
« Reply #1 on: January 11, 2013, 08:13:46 am »
Quote from: kiffa
Could I rebuild the other context state but keep the textures on the GPU when the surface size changes?
That's really simple: No, you can't. Context lost = textures lost.
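To make this rule concrete, here is a minimal compilable model of it in plain Java (GlContext and TextureCache are illustrative names, not jPCT-AE or Android API): texture ids are only valid for the context they were uploaded to, so a fresh context forces a full re-upload.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative model only: GlContext stands in for an OpenGL context.
class GlContext { }

class TextureCache {
    private GlContext boundContext;                 // context the ids belong to
    private final Map<String, Integer> ids = new HashMap<>();
    private int nextId = 1;
    private int uploads = 0;                        // counts (re)uploads

    // Returns a texture id valid for ctx, re-uploading everything
    // whenever the context instance has changed (context lost).
    int textureFor(GlContext ctx, String name) {
        if (ctx != boundContext) {                  // context lost
            ids.clear();                            // old ids are now invalid
            boundContext = ctx;
        }
        return ids.computeIfAbsent(name, n -> { uploads++; return nextId++; });
    }

    int uploadCount() { return uploads; }
}
```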

Offline kiffa
Re: Could i hold the textures on GPU when the surface size changed?
« Reply #2 on: January 11, 2013, 09:06:42 am »
I'm not clear on when the OpenGL context is lost, or what the system does when that happens. Please give me some advice.

What I want is this: when the window size changes, the world should auto-scale to the new size, but without unloading the textures from the GPU, because I want to avoid the delay.

If I change the code to an empty implementation:

Code:
// in class which implements Renderer
  public void onSurfaceChanged(GL10 gl, int width, int height)  {
    Log.d("jPCT-AE", "3d Renderer Surface Changed: " + width + ", " + height);
    Log.d("jPCT-AE", "3d Renderer Surface Changed: update frameBuffer");
   
  }

When the window size changed (64*64 to 128*128), the world didn't scale up to 128*128, but everything rendered fine.

Window size: 64*64 -> 128*128

1. Before the change: the world scales up, but only after a delay.

2. After the change: the world doesn't scale, but renders fine, with almost no delay.

I'm not sure whether the OpenGL context is lost when the window (surface) size changes, or only when the surface is destroyed.

I tried this: GLSurfaceView.setVisibility(View.GONE);

The view then disappears and the surface is destroyed (surfaceDestroyed() is called), so the OpenGL context seems to be destroyed. But when I call GLSurfaceView.setVisibility(View.VISIBLE), this happens:

surfaceCreated() -> surfaceChanged() -> Renderer.onSurfaceCreated() (empty implementation) -> Renderer.onSurfaceChanged() (empty implementation) -> onDrawFrame()

And this looks just like the window-size-change case: the world didn't scale, but rendered fine.

Full code:

Code:
// in class which implements Renderer

public void onSurfaceCreated(GL10 gl, EGLConfig config)
{
    Log.d("jPCT-AE", "3d Renderer Surface Created");
}

 public void onSurfaceChanged(GL10 gl, int width, int height)
{
    Log.d("jPCT-AE", "3d Renderer Surface Changed: " + width + ", " + height);

    if(mFirstEntry)  // mFirstEntry will be set to false in onDrawFrame
    {
      initFrameBuffer(gl, width, height);
      getGameContext().setFrameBuffer(mFrameBuffer);
      initFpsShown(); // rely on initFramebuffer
      mGameLogic.onCreate();
    }
//  else if(getGameContext().getGameConfig().canWindowScale())
//  {
//    Log.d("jPCT-AE", "3d Renderer Surface Changed: update frameBuffer");
//    mFrameBuffer.dispose();
//    initFrameBuffer(gl, width, height);
//    getGameContext().setFrameBuffer(mFrameBuffer);
//    initFpsShown();
//  }

    mGameLogic.onResize(mFrameBuffer.getWidth(), mFrameBuffer.getHeight());
}




Offline EgonOlsen
Re: Could i hold the textures on GPU when the surface size changed?
« Reply #3 on: January 11, 2013, 08:29:26 pm »
What I'm doing is checking whether the gl instance is the same as it was in the last call; if it isn't, I create a new FrameBuffer. But I guess that's not what you need. Desktop jPCT has a resize() method in FrameBuffer that is missing from jPCT-AE, because I saw no need for it. Maybe I was wrong. What is this actual window size you are talking about? Where does it come from (64*64 or 128*128 are by no means screen resolutions)?
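The check described above could look roughly like this. GL10 here is a stub standing in for the Android interface, and ContextAwareRenderer/checkContext are assumed names, so the sketch compiles outside Android:

```java
// Stub standing in for javax.microedition.khronos.opengles.GL10.
class GL10 { }

class ContextAwareRenderer {
    private GL10 lastGl;    // gl instance seen on the previous frame
    private int rebuilds;   // how often the FrameBuffer had to be recreated

    // Call once per frame (e.g. at the top of onDrawFrame). Returns true
    // when the gl instance changed, i.e. the context and all its textures
    // were lost and the FrameBuffer must be recreated.
    boolean checkContext(GL10 gl) {
        if (gl != lastGl) {
            lastGl = gl;
            rebuilds++;     // real code would dispose() and recreate here
            return true;
        }
        return false;
    }

    int rebuildCount() { return rebuilds; }
}
```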

Offline EgonOlsen
Re: Could i hold the textures on GPU when the surface size changed?
« Reply #4 on: January 11, 2013, 08:48:33 pm »
I've ported the resize() method to jPCT-AE, but it's untested for now. You can get that version here: http://jpct.de/download/beta/jpct_ae.jar.
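Assuming the ported method has the signature resize(int, int) and keeps the GPU-side textures (the point of the port), onSurfaceChanged could use it roughly as below. FrameBuffer and GL10 are small stubs here so the sketch compiles outside Android; in a real app they come from the SDK and jpct_ae.jar:

```java
// Stubs so the sketch compiles outside Android; not the real classes.
class GL10 { }

class FrameBuffer {
    private int w, h;
    FrameBuffer(GL10 gl, int w, int h) { this.w = w; this.h = h; }
    void resize(int w, int h) { this.w = w; this.h = h; }  // keeps textures
    int getWidth()  { return w; }
    int getHeight() { return h; }
}

class ResizingRenderer {
    private FrameBuffer fb;

    // On the first call, create the FrameBuffer; on later size changes,
    // resize it in place instead of dispose() + recreate, so the textures
    // stay on the GPU.
    void onSurfaceChanged(GL10 gl, int width, int height) {
        if (fb == null) {
            fb = new FrameBuffer(gl, width, height);
        } else {
            fb.resize(width, height);
        }
    }

    FrameBuffer buffer() { return fb; }
}
```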

Offline kiffa
Re: Could i hold the textures on GPU when the surface size changed?
« Reply #5 on: January 12, 2013, 12:31:03 pm »
The window size I'm talking about is the layout size of an Android top-level view (for my game, a GLSurfaceView).

My project is not a full-screen-activity game but a small floating-window game; the game window is displayed on top of all other windows. 64*64 means a window of 64*64 pixels, and the jPCT world is drawn in this window (which also means a 64*64 surface and a 64*64 FrameBuffer).

When my game starts, the phone screen is effectively split into two windows: the game window and the launcher (home/desktop) window, with the game window overlaid on the launcher. The user can see and operate both of them.

This may still be confusing, sorry for my English; if needed, I will post a screenshot soon. Some code:

Code:
// Create a small floating window on top of other apps
GLSurfaceView gameWindow = new GLSurfaceView(context);
WindowManager.LayoutParams params = new WindowManager.LayoutParams();
params.width = 64;
params.height = 64;
// A new window is added to the WMS; this triggers
// Renderer.onSurfaceCreated() -> onSurfaceChanged(64, 64) -> onDrawFrame()
windowManager.addView(gameWindow, params);

// Scale the window to 128*128
params = getCurrentParams(gameWindow);
params.width = 128;
params.height = 128;
windowManager.updateViewLayout(gameWindow, params);  // triggers Renderer.onSurfaceChanged(128, 128)


And thanks for your advice, I will check the gl instance for context changes and try the new method.
« Last Edit: January 12, 2013, 12:47:06 pm by kiffa »

Offline kiffa
Re: Could i hold the textures on GPU when the surface size changed?
« Reply #6 on: January 14, 2013, 05:02:52 am »
The FrameBuffer.resize() method is just what I needed! Thanks!

When will you add this to a new release jar?

Offline EgonOlsen
Re: Could i hold the textures on GPU when the surface size changed?
« Reply #7 on: January 14, 2013, 09:39:05 am »
Quote from: kiffa
When will you add this to a new release jar?

It's in the beta and it will stay there. You are safe to use the beta version until the next official release comes out.