State of OpenGL ES2.0 support

Started by EgonOlsen, May 13, 2011, 11:17:08 PM


Disastorm

#60
Hello, what is the advantage of ES 2.0 other than the ability to anti-alias? Is it better in terms of performance or memory? I guess it also allows shaders, whereas 1.1 didn't (I'm just guessing that since you posted a bunch of shader demos)?

EgonOlsen

Quote from: Disastorm on July 20, 2011, 10:31:11 AM
Hello, what is the advantage of ES 2.0 other than the ability to anti-alias? Is it better in terms of performance or memory? I guess it also allows shaders, whereas 1.1 didn't (I'm just guessing that since you posted a bunch of shader demos)?
No, it's not better in terms of performance (although you can write your own, customized shaders that might improve some special cases) and not in terms of memory (it additionally requires the memory for storing the shaders, which isn't much though). It allows for AA and shaders...neither feature is available in 1.x. Whether it's worth it depends on your needs.
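
In case it helps: switching to the ES 2.0 renderer with AA basically just means requesting a 2.0 context and an AA-capable EGL config. A rough sketch along the lines of the setup code posted further down in this thread (view creation belongs in the Activity's onCreate(), the FrameBuffer in the renderer):

// in the Activity's onCreate(): request an ES 2.0 context plus an anti-aliasing config
GLSurfaceView view = new GLSurfaceView(getApplication());
view.setEGLContextClientVersion(2);
view.setEGLConfigChooser(new AAConfigChooser(view));

// in the Renderer's onSurfaceChanged(): creating the FrameBuffer without a GL10
// instance selects jPCT-AE's OpenGL ES 2.0 render path
fb = new FrameBuffer(w, h);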

Thomas.

#62
any news about releasing the next alpha with antialiasing? :)

EgonOlsen

Can do it anytime...i'll try to fix Disastorm's issue first and then it will come with that fix anyway.

EgonOlsen

New alpha has been uploaded. See "new version"-thread for details.

Thomas.

#65
no FPS slowdown with AA in my game :) ... the upper image is without AA and the lower one is with it. Is this normal for blitted textures?


EgonOlsen

#66
Quote from: Thomas. on July 23, 2011, 10:59:28 PM
no FPS slowdown with AA in my game :) ... the upper image is without AA and the lower one is with it. Is this normal for blitted textures?

Yes. That's what i meant by:

Quote
...because it also shows some effect on texture content (i.e. it somehow blurs transparent edges (fps counter) for example)...

Edit: But it looks as if you can get rid of this if you disable filtering on the textures that are used for blitting.
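
In code, that's just something like this (a sketch; the resource and texture name are made up, and i'm assuming Texture.setFiltering() is the right switch here):

// texture that is only used for blitting (e.g. the fps counter's font)
Texture blitTex = new Texture(res.openRawResource(R.raw.fps_font), true); // true = keep alpha
blitTex.setFiltering(false); // assumed API: disable filtering so AA doesn't blur the transparent edges
TextureManager.getInstance().addTexture("fpsFont", blitTex);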

Thomas.

how can I apply shaders like depth of field or per-pixel lighting? I will try to find some on the Internet and test how the application looks with them...

EgonOlsen

#69
It's not THAT easy, i'm afraid. First of all, depth of field is a post processing effect. To apply a post processing effect in the form of a shader, you have to render the whole scene into a texture and then onto a quad to display it, because neither the fragment nor the vertex shader has access to the framebuffer's data itself. At the moment, you can't really do this. You would need support for frame buffer objects (FBOs), and maybe render the scene into different FBOs split into color and depth buffers and such. Support for this is on my list, but it isn't implemented yet.

Everything else (i.e. "normal" shader code that isn't used for full screen effects) is possible. However, as i've mentioned in my release post, it..."lacks some documentation". What i meant by that is that the engine has to know where to put which data in the shader, because otherwise you wouldn't be able to access the vertices, normals, lights etc. in your shader. It does this by making you follow a naming convention when you write your shader, and i haven't documented this naming yet...i'll do that briefly now. The idea is this: if jPCT-AE finds any of the following uniforms or attributes in either shader part (vertex/fragment), it'll put the corresponding engine data into it. If it doesn't find one, it'll simply skip that data. Ok, here we go (a minimal example shader follows the list):

uniform mat4 modelViewMatrix; - The model view matrix, i.e. the matrix that makes the transformation from object into camera space
uniform mat4 modelViewProjectionMatrix; - The model view projection matrix, i.e. the model view matrix * projection matrix
uniform mat4 textureMatrix; - The texture matrix

uniform vec4 additionalColor; -  An object's additional color (stored in .rgb of the uniform).
uniform vec4 ambientColor; -  The world's ambient color (stored in .rgb of the uniform).

uniform float alpha; - The alpha value of the object.
uniform float shininess; - The shininess if specular lighting is being used.
uniform bool useColors; -  true, if the object contains additional vertex colors.

uniform float fogStart; - The depth coordinate, at which the fog starts. -1.0 if there is no fog.
uniform float fogEnd; - The depth coordinate of maximum fog.
uniform vec3 fogColor; - The fog color.

uniform int lightCount; - The number of lights that light this object.

uniform vec3 lightPositions[8]; - The light positions in camera space.
uniform vec3 diffuseColors[8]; - The diffuse color of each light source.
uniform vec3 specularColors[8]; - The specular color of each light source.

uniform sampler2D textureUnit0; - The texture sampler for the first texture layer.
uniform sampler2D textureUnit1; - The texture sampler for the second texture layer.
uniform sampler2D textureUnit2; - The texture sampler for the third texture layer.
uniform sampler2D textureUnit3; - The texture sampler for the fourth texture layer.

uniform int textureCount; - The number of texture layers.
uniform int blendingMode[4]; - The blending modes between these layers (0==MODULATE, 1==ADD, 2==REPLACE, 3==BLEND).

attribute vec4 position; - The vertex positions in object space.
attribute vec3 normal; - The vertex normals in object space.
attribute vec4 color; - The additional vertex colors (if any).
attribute vec4 tangent; - The tangent vectors. The presence of this attribute in a shader's source code will trigger the tangent vector calculation when calling build().
attribute vec2 texture0; - The texture coordinates for the first stage. 
attribute vec2 texture1; - The texture coordinates for the second stage.
attribute vec2 texture2; - The texture coordinates for the third stage.
attribute vec2 texture3; - The texture coordinates for the fourth stage.
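
To make this concrete, here is a minimal example shader pair that relies only on the names above (one texture layer, no lighting). It's just an illustration of the convention, not the engine's default shader:

// vertex shader
uniform mat4 modelViewProjectionMatrix;
attribute vec4 position;
attribute vec2 texture0;
varying vec2 texCoord;

void main() {
    texCoord = texture0;
    gl_Position = modelViewProjectionMatrix * position;
}

// fragment shader
precision mediump float;
uniform sampler2D textureUnit0;
varying vec2 texCoord;

void main() {
    gl_FragColor = texture2D(textureUnit0, texCoord);
}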

Any additional uniforms can be given to the shader via the methods in GLSLShader. Currently, it's lacking the option to feed additional vertex attributes into the shader, but in most cases this shouldn't be needed. If it is, consider storing them in a texture instead.
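
For example, something like this should work (an untested sketch; i'm assuming the setUniform() overloads in GLSLShader here, and vertexSrc, fragmentSrc, obj and the uniform name "myTime" are just placeholders):

GLSLShader shader = new GLSLShader(vertexSrc, fragmentSrc);
obj.setShader(shader);

// "myTime" is a custom uniform, i.e. it has to be declared as "uniform float myTime;"
// in the shader source. setUniform() is assumed here as the way to feed it each frame.
shader.setUniform("myTime", (System.currentTimeMillis() % 10000) / 10000f);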

As you can see, you can't simply copy and paste shaders from the internet, because you have to make them use this naming convention. In addition, very few of the shaders that you'll find are written for OpenGL ES. Most are written for "normal" OpenGL...and "normal" OpenGL differs in that it already has access to some pre-defined attributes and uniforms like gl_NormalMatrix, gl_Normal, gl_Vertex, gl_LightSource etc. If you want to port such a shader, you first have to find out which pre-defined attribute/uniform maps to which one in jPCT-AE, or you may have to set it from the outside in your code.
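
As a rough example of such a port, take a typical desktop vertex shader that does gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; and transforms the normal with gl_NormalMatrix. Under the naming convention above, that becomes something like this (a sketch; there's no dedicated normal matrix in the convention, so the model view matrix is used for the normal, which is fine as long as the object isn't scaled non-uniformly):

uniform mat4 modelViewProjectionMatrix;
uniform mat4 modelViewMatrix;

attribute vec4 position;
attribute vec3 normal;

varying vec3 normalVec;

void main() {
    // gl_NormalMatrix replacement: transform the normal with the model view matrix
    normalVec = vec3(modelViewMatrix * vec4(normal, 0.0));
    // gl_ModelViewProjectionMatrix * gl_Vertex becomes:
    gl_Position = modelViewProjectionMatrix * position;
}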

As said, you'll find the default shader set in the jar and an example shader of a different kind in the HelloShader-example.

Thomas.

#70
Thanks, I've never written any shaders, so I'll start with some tutorials. Do you have any advice on where I should look?

EgonOlsen

I'm not sure...any shader tutorial for OpenGL ES should do, as long as it's good. It doesn't have to be Android specific. You'll most likely have more luck with tutorials for the iPhone. The syntax is the same...just keep in mind that on the iPhone there's only PowerVR, while on Android a lot of different implementations exist.

Thomas.

Are you planning to implement per-pixel lighting and spot lights in the engine? I think more people would use them :)

EgonOlsen

I might add some shaders of that kind, but i'm not going to add it as core functionality.

Thomas.

#74
I copied the fragment point light shader from this page, but the boxes are black. Is that ok?

// vertex shader (vertex_point_light)
uniform mat4 modelViewProjectionMatrix;
uniform mat4 modelViewMatrix;

attribute vec4 position;
attribute vec4 color;
attribute vec3 normal;

varying vec3 v_Position;
varying vec4 v_Color;
varying vec3 v_Normal;

void main()
{
    // transform the position and normal into camera space for the lighting calculation
    v_Position = vec3(modelViewMatrix * position);
    v_Color = color;
    v_Normal = vec3(modelViewMatrix * vec4(normal, 0.0));
    gl_Position = modelViewProjectionMatrix * position;
}


// fragment shader (fragment_point_light)
precision mediump float;
uniform vec3 lightPositions[8];
varying vec3 v_Position;
varying vec4 v_Color;
varying vec3 v_Normal;

void main()
{
    // diffuse term of the first light, attenuated by the squared distance
    float distance = length(lightPositions[0] - v_Position);
    vec3 lightVector = normalize(lightPositions[0] - v_Position);
    float diffuse = max(dot(v_Normal, lightVector), 0.1);
    diffuse = diffuse * (1.0 / (1.0 + (0.25 * distance * distance)));
    gl_FragColor = v_Color * diffuse;
}


package cz.test.point_light;

import java.lang.reflect.Field;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.app.Activity;
import android.content.pm.ActivityInfo;
import android.content.res.Resources;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import android.view.MotionEvent;
import android.view.Window;
import android.view.WindowManager;
import android.view.WindowManager.LayoutParams;

import com.threed.jpct.Camera;
import com.threed.jpct.FrameBuffer;
import com.threed.jpct.GLSLShader;
import com.threed.jpct.Light;
import com.threed.jpct.Loader;
import com.threed.jpct.Logger;
import com.threed.jpct.Object3D;
import com.threed.jpct.Primitives;
import com.threed.jpct.RGBColor;
import com.threed.jpct.SimpleVector;
import com.threed.jpct.Texture;
import com.threed.jpct.TextureManager;
import com.threed.jpct.World;
import com.threed.jpct.util.AAConfigChooser;
import com.threed.jpct.util.BitmapHelper;
import com.threed.jpct.util.MemoryHelper;

/**
* @author EgonOlsen
*
*/
public class Point_light_testActivity extends Activity {

    private static Point_light_testActivity master = null;

    private GLSurfaceView mGLView;
    private MyRenderer renderer = null;
    private FrameBuffer fb = null;
    private World world = null;
    private RGBColor back = new RGBColor(50, 50, 100);

    private float touchTurn = 0;
    private float touchTurnUp = 0;

    private float xpos = -1;
    private float ypos = -1;

    private Object3D cube0 = null;
    private Object3D cube1 = null;
    private Object3D cube2 = null;
    private Object3D cube3 = null;
    private Object3D dummy = null;

    private int fps = 0;

    private Light light;

    protected void onCreate(Bundle savedInstanceState) {

        Logger.log("onCreate");

        if (master != null) {
            copy(master);
        }

        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(LayoutParams.FLAG_FULLSCREEN, LayoutParams.FLAG_FULLSCREEN);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON, WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

        super.onCreate(savedInstanceState);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
        mGLView = new GLSurfaceView(getApplication());

        // request an OpenGL ES 2.0 context and an anti-aliasing capable EGL config
        mGLView.setEGLContextClientVersion(2);
        mGLView.setEGLConfigChooser(new AAConfigChooser(mGLView));

        renderer = new MyRenderer();
        mGLView.setRenderer(renderer);
        setContentView(mGLView);
    }

    @Override
    protected void onPause() {
        super.onPause();
        mGLView.onPause();
    }

    @Override
    protected void onResume() {
        super.onResume();
        mGLView.onResume();
    }

    protected void onStop() {
        super.onStop();
    }

    private void copy(Object src) {
        try {
            Logger.log("Copying data from master Activity!");
            Field[] fs = src.getClass().getDeclaredFields();
            for (Field f : fs) {
                f.setAccessible(true);
                f.set(this, f.get(src));
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public boolean onTouchEvent(MotionEvent me) {

        if (me.getAction() == MotionEvent.ACTION_DOWN) {
            xpos = me.getX();
            ypos = me.getY();
            return true;
        }

        if (me.getAction() == MotionEvent.ACTION_UP) {
            xpos = -1;
            ypos = -1;
            touchTurn = 0;
            touchTurnUp = 0;
            return true;
        }

        if (me.getAction() == MotionEvent.ACTION_MOVE) {
            float xd = me.getX() - xpos;
            float yd = me.getY() - ypos;

            xpos = me.getX();
            ypos = me.getY();

            touchTurn = xd / -100f;
            touchTurnUp = yd / -100f;
            return true;
        }

        try {
            Thread.sleep(15);
        } catch (Exception e) {
            // No need for this...
        }

        return super.onTouchEvent(me);
    }

    protected boolean isFullscreenOpaque() {
        return true;
    }

    class MyRenderer implements GLSurfaceView.Renderer {

        private long time = System.currentTimeMillis();
        private boolean stop = false;

        public MyRenderer() {
        }

        public void stop() {
            stop = true;
        }

        public void onSurfaceChanged(GL10 gl, int w, int h) {
            if (fb != null) {
                fb.dispose();
            }
            // no GL10 instance passed in, i.e. the ES 2.0 render path is used
            fb = new FrameBuffer(w, h);

            if (master == null) {

                world = new World();
                world.setAmbientLight(0, 0, 0);

                Texture texture = new Texture(BitmapHelper.rescale(BitmapHelper.convert(getResources().getDrawable(R.drawable.icon)), 64, 64));
                TextureManager.getInstance().addTexture("texture", texture);

                dummy = Object3D.createDummyObj();

                cube0 = Primitives.getCube(10);
                cube0.rotateY(-(float) Math.PI / 4f);
                cube0.rotateMesh();
                cube0.clearRotation();
                cube0.calcTextureWrapSpherical();
                cube0.setTexture("texture");
                cube0.strip();
                cube0.build();

                cube1 = cube0.cloneObject();
                cube2 = cube0.cloneObject();
                cube3 = cube0.cloneObject();

                world.addObject(cube0);
                world.addObject(cube1);
                world.addObject(cube2);
                world.addObject(cube3);

                cube0.translate(-20, -20, 0);
                cube1.translate(20, -20, 0);
                cube2.translate(-20, 20, 0);
                cube3.translate(20, 20, 0);

                cube0.addParent(dummy);
                cube1.addParent(dummy);
                cube2.addParent(dummy);
                cube3.addParent(dummy);

                Object3D plane = Primitives.getPlane(40, 3);
                plane.translate(0, 0, 10);
                plane.build();
                world.addObject(plane);

                Resources res = getResources();

                // load the point light shader sources from res/raw and assign them to the cubes
                GLSLShader pointLight = new GLSLShader(Loader.loadTextFile(res.openRawResource(R.raw.vertex_point_light)),
                        Loader.loadTextFile(res.openRawResource(R.raw.fragment_point_light)));

                cube0.setShader(pointLight);
                cube1.setShader(pointLight);
                cube2.setShader(pointLight);
                cube3.setShader(pointLight);
                // plane.setShader(pointLight);

                light = new Light(world);
                light.setPosition(new SimpleVector(20, 0, -1));

                light.setIntensity(255, 255, 255);

                Camera cam = world.getCamera();
                cam.moveCamera(Camera.CAMERA_MOVEOUT, 100);

                MemoryHelper.compact();

                if (master == null) {
                    Logger.log("Saving master Activity!");
                    master = Point_light_testActivity.this;
                }

                // Logger.setLogLevel(Logger.DEBUG);
            }
        }

        public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        }

        public void onDrawFrame(GL10 gl) {

            try {
                if (!stop) {
                    if (touchTurn != 0) {
                        dummy.rotateY(touchTurn);
                        touchTurn = 0;
                    }

                    if (touchTurnUp != 0) {
                        dummy.rotateX(touchTurnUp);
                        touchTurnUp = 0;
                    }

                    light.rotate(new SimpleVector(0, 0, 0.01f), new SimpleVector());

                    fb.clear(back);
                    world.renderScene(fb);
                    world.draw(fb);
                    fb.display();

                    if (System.currentTimeMillis() - time >= 1000) {
                        Logger.log(fps + "fps");
                        fps = 0;
                        time = System.currentTimeMillis();
                    }
                    fps++;
                } else {
                    if (fb != null) {
                        fb.dispose();
                        fb = null;
                    }
                }
            } catch (Exception e) {
                Logger.log(e, Logger.MESSAGE);
            }
        }
    }
}