IRenderer cannot be resolved to a variable

Started by nima.sh23, August 26, 2015, 07:25:08 AM


nima.sh23

Hello
I'm using the latest jPCT library for an Android project. I want to use texture splatting, but I'm running into some problems: I can't use the IRenderer interface constants IRenderer.RENDERER_SOFTWARE and IRenderer.RENDERER_OPENGL, and FrameBuffer doesn't have a SAMPLINGMODE_HARDWARE_ONLY property.
Thanks

EgonOlsen

That's because that example is for desktop jPCT. For jPCT-AE, the idea remains the same, but the setup code is of course different (it follows the usual jPCT-AE project setup), and the shaders have to be modified so that they use jPCT-AE's automagically injected values instead of the ones from desktop OpenGL.

nima.sh23

Thanks for your comment.
Does that mean I can't use it in Android applications?

EgonOlsen

You can use the basic idea but not the specific source code example. I have some splatting shaders for Android... I'll post them later.

EgonOlsen

Ok, here you go. The basic idea remains the same: you assign four textures to an object. The first three are the actual texture maps to blend; the fourth one is the blending map. You can optimize this approach by storing the blending data in the alpha channel of each texture instead, but I didn't do that here. Remember to adjust http://www.jpct.net/jpct-ae/doc/com/threed/jpct/Config.html#maxTextureLayers to 4 for this to work on Android.
Then you assign a custom shader to your object. This one is taken from my RPG project. It supports one light source and fogging, and in addition to the 3 base textures, it uses another, darkened version of one of them.
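Conceptually, each channel of the blend map weights one texture layer through a chain of GLSL mix() calls. This plain-Java sketch (illustrative only, no jPCT dependency; class and method names are made up) shows the same arithmetic for a single color channel:

```java
// CPU sketch of the shader's blend chain (illustrative, no jPCT needed).
// mix(a, b, t) = a*(1-t) + b*t, exactly like GLSL's mix().
public class BlendSketch {

    static double mix(double a, double b, double t) {
        return a * (1.0 - t) + b * t;
    }

    // One color channel of:
    // mix(mix(mix(col0, col1, blend.r), col2, blend.g), col3, blend.b)
    static double splat(double col0, double col1, double col2, double col3,
                        double r, double g, double b) {
        return mix(mix(mix(col0, col1, r), col2, g), col3, b);
    }

    public static void main(String[] args) {
        // Blend map pixel is pure red -> only texture layer 1 shows.
        System.out.println(splat(0.2, 0.8, 0.5, 0.1, 1.0, 0.0, 0.0)); // 0.8
        // Blend map pixel is black -> only texture layer 0 shows.
        System.out.println(splat(0.2, 0.8, 0.5, 0.1, 0.0, 0.0, 0.0)); // 0.2
    }
}
```

A blend map that fades from black to red therefore fades the terrain from layer 0 to layer 1.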

Vertex shader:

attribute vec2 texture0;
attribute vec2 texture1;
attribute vec2 texture2;
attribute vec2 texture3;

uniform vec4 additionalColor;
uniform vec4 ambientColor;

uniform mat4 modelViewMatrix;
uniform mat4 modelViewProjectionMatrix;
uniform mat4 textureMatrix;

uniform float fogStart;
uniform float fogEnd;
uniform vec3 fogColor;

uniform vec3 lightPositions[8];
uniform vec3 diffuseColors[8];

attribute vec4 position;
attribute vec3 normal;

varying vec2 texCoord0;
varying vec2 texCoord1;
varying vec2 texCoord2;
varying vec2 texCoord3;

varying vec4 vertexColor;
varying vec4 fogVertexColor;

const vec4 WHITE = vec4(1.0,1.0,1.0,1.0);

void main(void)
{
    texCoord0 = texture0;
    texCoord1 = texture1 * 0.5;
    texCoord2 = texture2;
    texCoord3 = texture3;

    vertexColor = additionalColor + ambientColor;

    vec3 vVertex = vec3(modelViewMatrix * position);
    vec3 normalEye = normalize(modelViewMatrix * vec4(normal, 0.0)).xyz;
    float angle = dot(normalEye, normalize(lightPositions[0] - vVertex));

    if (angle > 0.0) {
        vertexColor += vec4(diffuseColors[0] * angle, 0.0);
    }

    vertexColor = 2.0 * vec4(min(WHITE, vertexColor).xyz, 0.5);

    float fogWeight = clamp((-vVertex.z - fogStart) / (fogEnd - fogStart), 0.0, 1.0);
    fogVertexColor = vec4(fogColor, 0.0) * fogWeight;
    fogWeight = 1.0 - fogWeight;

    vertexColor *= fogWeight;
    gl_Position = modelViewProjectionMatrix * position;
}


Fragment shader:

precision highp float;

uniform sampler2D textureUnit0;
uniform sampler2D textureUnit1;
uniform sampler2D textureUnit2;
uniform sampler2D textureUnit3;

varying vec2 texCoord0;
varying vec2 texCoord1;
varying vec2 texCoord2;
varying vec2 texCoord3;

varying vec4 vertexColor;
varying vec4 fogVertexColor;

void main(void)
{
    vec4 col0 = texture2D(textureUnit0, texCoord0);
    vec4 col1 = texture2D(textureUnit1, texCoord1);
    vec4 col2 = texture2D(textureUnit2, texCoord2);
    vec4 col3 = col2 * 0.4;
    vec4 blend = texture2D(textureUnit3, texCoord3);

    gl_FragColor = vertexColor * mix(mix(mix(col0, col1, blend.r), col2, blend.g), col3, blend.b) + fogVertexColor;
}

nima.sh23

Thanks for the reply.
I've read this link:
http://www.jpct.net/wiki/index.php?title=Texture_splatting_on_a_terrain
and now I need the same code for Android.

EgonOlsen

It's basically the exact same thing. The only real difference is in the shaders, and I've already posted some example shaders for Android that should get you started. They might not be exactly what you need, but then again no other example shader would be either, so you have to modify them anyway...

nima.sh23

Thanks a lot for your answer.
My problem is that those methods don't exist on FrameBuffer.

buffer = new FrameBuffer(1024, 768, FrameBuffer.SAMPLINGMODE_HARDWARE_ONLY);
buffer.disableRenderer(IRenderer.RENDERER_SOFTWARE);
buffer.enableRenderer(IRenderer.RENDERER_OPENGL);

In the first line of the code above, I can't pass the third parameter, and in the second and third lines FrameBuffer doesn't have the disableRenderer and enableRenderer methods.
FrameBuffer also doesn't have the update and displayGLOnly methods.
Does that mean jPCT-AE doesn't support these methods?

EgonOlsen

Just take a basic jPCT-AE example to see how it's done there. There's no copy-and-paste solution to your question. Without a basic understanding of how jPCT-AE works, you won't get far. Once you have that, it's really easy to port desktop jPCT stuff to Android.

nima.sh23

I understand how jPCT works; I know this sample is for desktop jPCT, and I use jPCT-AE.
I just have a simple question:
How can I use texture splatting in jPCT-AE?

EgonOlsen

Quote from: nima.sh23 on September 13, 2015, 11:33:11 AM
I just have a simple question:
How can I use texture splatting in JPCT-AE?
And I actually gave you the answer already: by doing the same thing that the example for the desktop does. The only difference is the shader, and I gave you some shaders to start with as well... whatever, here's a basic example:


package com.threed.jpct.examples.texturesplat;

import java.lang.reflect.Field;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.app.Activity;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import android.view.MotionEvent;

import com.threed.jpct.Camera;
import com.threed.jpct.Config;
import com.threed.jpct.FrameBuffer;
import com.threed.jpct.GLSLShader;
import com.threed.jpct.Light;
import com.threed.jpct.Loader;
import com.threed.jpct.Logger;
import com.threed.jpct.Object3D;
import com.threed.jpct.PolygonManager;
import com.threed.jpct.RGBColor;
import com.threed.jpct.SimpleVector;
import com.threed.jpct.Texture;
import com.threed.jpct.TextureInfo;
import com.threed.jpct.TextureManager;
import com.threed.jpct.World;
import com.threed.jpct.util.MemoryHelper;

/**
*
* @author EgonOlsen
*
*/
public class TextureSplat extends Activity {

    // Used to handle pause and resume...
    private static TextureSplat master = null;

    private GLSurfaceView mGLView;
    private MyRenderer renderer = null;
    private FrameBuffer fb = null;
    private World world = null;
    private RGBColor back = new RGBColor(50, 50, 100);

    private float touchTurn = 0;
    private float touchTurnUp = 0;

    private float xpos = -1;
    private float ypos = -1;

    private Object3D terrain = null;
    private GLSLShader splatter = null;
    private int fps = 0;

    protected void onCreate(Bundle savedInstanceState) {

        Config.maxTextureLayers = 4;
        Config.glTrilinear = true;
        Texture.defaultToMipmapping(true);

        if (master != null) {
            copy(master);
        }

        super.onCreate(savedInstanceState);
        mGLView = new GLSurfaceView(getApplication());
        mGLView.setEGLContextClientVersion(2);
        renderer = new MyRenderer();
        mGLView.setRenderer(renderer);
        setContentView(mGLView);
    }

    @Override
    protected void onPause() {
        super.onPause();
        mGLView.onPause();
    }

    @Override
    protected void onResume() {
        super.onResume();
        mGLView.onResume();
    }

    @Override
    protected void onStop() {
        super.onStop();
    }

    private void copy(Object src) {
        try {
            Logger.log("Copying data from master Activity!");
            Field[] fs = src.getClass().getDeclaredFields();
            for (Field f : fs) {
                f.setAccessible(true);
                f.set(this, f.get(src));
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public boolean onTouchEvent(MotionEvent me) {

        if (me.getAction() == MotionEvent.ACTION_DOWN) {
            xpos = me.getX();
            ypos = me.getY();
            return true;
        }

        if (me.getAction() == MotionEvent.ACTION_UP) {
            xpos = -1;
            ypos = -1;
            touchTurn = 0;
            touchTurnUp = 0;
            return true;
        }

        if (me.getAction() == MotionEvent.ACTION_MOVE) {
            float xd = me.getX() - xpos;
            float yd = me.getY() - ypos;

            xpos = me.getX();
            ypos = me.getY();

            touchTurn = xd / -100f;
            touchTurnUp = yd / -100f;
            return true;
        }

        try {
            Thread.sleep(15);
        } catch (Exception e) {
            // No need for this...
        }

        return super.onTouchEvent(me);
    }

    protected boolean isFullscreenOpaque() {
        return true;
    }

    class MyRenderer implements GLSurfaceView.Renderer {

        private long time = System.currentTimeMillis();

        public MyRenderer() {
        }

        public void onSurfaceChanged(GL10 gl, int w, int h) {
            if (fb != null) {
                fb.dispose();
            }

            fb = new FrameBuffer(w, h); // OpenGL ES 2.0 constructor

            if (master == null) {

                try {

                    TextureManager tm = TextureManager.getInstance();
                    tm.addTexture("grass", new Texture(getResources().getAssets().open("grass.jpg")));
                    tm.addTexture("sand", new Texture(getResources().getAssets().open("sand.jpg")));
                    tm.addTexture("rocks", new Texture(getResources().getAssets().open("rocks.jpg")));
                    tm.addTexture("splat", new Texture(getResources().getAssets().open("splat2.png")));

                    terrain = Object3D.mergeAll(Loader.loadOBJ(getResources().getAssets().open("terrain.obj"), getResources().getAssets().open("terrain.mtl"), 1));
                    terrain.rotateX(-(float) Math.PI / 2f);
                    terrain.rotateMesh();
                    terrain.clearRotation();
                    setTexture(terrain);
                    terrain.compile();
                    terrain.build();

                    splatter = new GLSLShader(Loader.loadTextFile(getResources().getAssets().open("splat.vert")),
                            Loader.loadTextFile(getResources().getAssets().open("splat.frag")));

                    terrain.setShader(splatter);

                    world = new World();
                    world.setClippingPlanes(1, 15000);
                    world.addObject(terrain);
                    world.setAmbientLight(60, 60, 60);

                    Light light = new Light(world);
                    light.setPosition(new SimpleVector(500, -4000, -2000));
                    light.setAttenuation(-1);
                    light.setIntensity(200, 255, 255);

                    Camera camera = world.getCamera();
                    camera.setPosition(0, -3500, -500);
                    camera.lookAt(terrain.getTransformedCenter());
                    MemoryHelper.compact();

                    if (master == null) {
                        Logger.log("Saving master Activity!");
                        master = TextureSplat.this;
                    }
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            }
        }

        /**
         * This method generates texture coordinates for the mesh based on the
         * coordinates in object space. The normal texture layers are tiled
         * while the splatting texture (which is the last layer) covers the mesh
         * exactly.
         *
         * @param obj the terrain object
         */
        private void setTexture(Object3D obj) {
            TextureManager tm = TextureManager.getInstance();

            obj.calcBoundingBox();

            float[] bb = obj.getMesh().getBoundingBox();

            float minX = bb[0];
            float maxX = bb[1];
            float minZ = bb[4];
            float maxZ = bb[5];
            float dx = maxX - minX;
            float dz = maxZ - minZ;

            float dxs = dx;
            float dzs = dz;

            dx /= 200f;
            dz /= 200f;

            float dxd = dx;
            float dzd = dz;

            int tid = tm.getTextureID("grass");
            int sid = tm.getTextureID("rocks");
            int trid = tm.getTextureID("sand");
            int bid = tm.getTextureID("splat");

            PolygonManager pm = obj.getPolygonManager();
            for (int i = 0; i < pm.getMaxPolygonID(); i++) {
                SimpleVector v0 = pm.getTransformedVertex(i, 0);
                SimpleVector v1 = pm.getTransformedVertex(i, 1);
                SimpleVector v2 = pm.getTransformedVertex(i, 2);

                // Assign textures for the first three layers (the "normal" textures)...
                TextureInfo ti = new TextureInfo(tid, v0.x / dx, v0.z / dz, v1.x / dx, v1.z / dz, v2.x / dx, v2.z / dz);
                ti.add(sid, v0.x / dxd, v0.z / dzd, v1.x / dxd, v1.z / dzd, v2.x / dxd, v2.z / dzd, TextureInfo.MODE_ADD);
                ti.add(trid, v0.x / dxd, v0.z / dzd, v1.x / dxd, v1.z / dzd, v2.x / dxd, v2.z / dzd, TextureInfo.MODE_ADD);

                // Assign the splatting texture...
                ti.add(bid, -(v0.x - minX) / dxs, (v0.z - minZ) / dzs, -(v1.x - minX) / dxs, (v1.z - minZ) / dzs, -(v2.x - minX) / dxs, (v2.z - minZ) / dzs, TextureInfo.MODE_ADD);
                pm.setPolygonTexture(i, ti);
            }
        }

        public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        }

        public void onDrawFrame(GL10 gl) {
            if (touchTurn != 0) {
                terrain.rotateY(touchTurn);
                touchTurn = 0;
            }

            if (touchTurnUp != 0) {
                terrain.rotateX(touchTurnUp);
                touchTurnUp = 0;
            }

            fb.clear(back);
            world.renderScene(fb);
            world.draw(fb);
            fb.display();

            if (System.currentTimeMillis() - time >= 1000) {
                Logger.log(fps + "fps");
                fps = 0;
                time = System.currentTimeMillis();
            }
            fps++;
        }
    }
}
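The texture-coordinate math in setTexture above can be checked in isolation. This plain-Java sketch (illustrative names, no jPCT types) shows the two mappings: the tiled layers repeat across the terrain because the divisor is the extent divided by 200, while the splat layer is normalized so the blend map stretches over the mesh exactly once (the original additionally mirrors U with a leading minus sign to match the map's orientation):

```java
// Illustrative sketch of setTexture's UV math (no jPCT dependency).
public class UvSketch {

    // Tiled layers: divide object-space x by (extent / 200), so the
    // texture repeats on the order of 200 times across the terrain.
    static double tiledU(double x, double minX, double maxX) {
        double dx = (maxX - minX) / 200.0;
        return x / dx;
    }

    // Splat layer: normalize into [0..1] so the blend map covers
    // the whole mesh exactly once.
    static double splatU(double x, double minX, double maxX) {
        return (x - minX) / (maxX - minX);
    }

    public static void main(String[] args) {
        System.out.println(splatU(-100, -100, 100)); // 0.0 (left edge)
        System.out.println(splatU(100, -100, 100));  // 1.0 (right edge)
    }
}
```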


Fragment shader:

precision mediump float;

uniform sampler2D textureUnit0;
uniform sampler2D textureUnit1;
uniform sampler2D textureUnit2;
uniform sampler2D textureUnit3;

uniform int textureCount;
uniform int blendingMode[4];

varying vec2 texCoord0;
varying vec2 texCoord1;
varying vec2 texCoord2;
varying vec2 texCoord3;
varying vec4 vertexColor;
varying float fogWeight;
varying vec3 fogVertexColor;

const vec4 WHITE = vec4(1,1,1,1);

void main() {
    vec4 col0 = texture2D(textureUnit0, texCoord0);
    vec4 col1 = texture2D(textureUnit1, texCoord1);
    vec4 col2 = texture2D(textureUnit2, texCoord2);
    vec4 col3 = col2 * 0.4;
    vec4 blend = texture2D(textureUnit3, texCoord3);

    vec4 col = vertexColor * mix(mix(mix(col0, col1, blend.r), col2, blend.g), col3, blend.b);

    if (fogWeight > -0.9) {
        col.xyz = (1.0 - fogWeight) * col.xyz + fogVertexColor;
    }

    gl_FragColor = col;
}


Vertex shader:

uniform mat4 modelViewMatrix;
uniform mat4 modelViewProjectionMatrix;
uniform mat4 textureMatrix;

uniform vec4 additionalColor;
uniform vec4 ambientColor;

uniform float alpha;
uniform float shininess;
uniform bool useColors;

uniform float fogStart;
uniform float fogEnd;
uniform vec3 fogColor;

uniform int lightCount;

uniform vec3 lightPositions[8];
uniform vec3 diffuseColors[8];
uniform vec3 specularColors[8];
uniform float attenuation[8];

attribute vec4 position;
attribute vec3 normal;
attribute vec4 color;
attribute vec2 texture0;
attribute vec2 texture1;
attribute vec2 texture2;
attribute vec2 texture3;

varying vec2 texCoord0;
varying vec2 texCoord1;
varying vec2 texCoord2;
varying vec2 texCoord3;
varying vec4 vertexColor;
varying vec3 fogVertexColor;
varying float fogWeight;

const vec4 WHITE = vec4(1,1,1,1);

void main() {

    texCoord0 = (textureMatrix * vec4(texture0, 0, 1)).xy;
    texCoord1 = texture1;
    texCoord2 = texture2;
    texCoord3 = texture3;

    vec4 vertexPos = modelViewMatrix * position;
    vertexColor = ambientColor + additionalColor;

    if (lightCount > 0) {
        // This is correct only if the modelview matrix is orthogonal. In jPCT-AE, it always is...unless you fiddle around with it.
        vec3 normalEye = normalize(modelViewMatrix * vec4(normal, 0.0)).xyz;

        float angle = dot(normalEye, normalize(lightPositions[0] - vertexPos.xyz));

        if (angle > 0.0) {
            vertexColor += vec4((diffuseColors[0] * angle + specularColors[0] * pow(angle, shininess)) * (1.0 / (1.0 + length(lightPositions[0] - vertexPos.xyz) * attenuation[0])), 1);
        }

        // Freaky Adreno shader compiler can't handle loops without locking or creating garbage results....this is why the
        // loop has been unrolled here. It's faster this way on PowerVR SGX540 too, even if PVRUniSCoEditor says otherwise...

        if (lightCount > 1) {
            angle = dot(normalEye, normalize(lightPositions[1] - vertexPos.xyz));

            if (angle > 0.0) {
                vertexColor += vec4((diffuseColors[1] * angle + specularColors[1] * pow(angle, shininess)) * (1.0 / (1.0 + length(lightPositions[1] - vertexPos.xyz) * attenuation[1])), 1);
            }

            if (lightCount > 2) {
                angle = dot(normalEye, normalize(lightPositions[2] - vertexPos.xyz));

                if (angle > 0.0) {
                    vertexColor += vec4((diffuseColors[2] * angle + specularColors[2] * pow(angle, shininess)) * (1.0 / (1.0 + length(lightPositions[2] - vertexPos.xyz) * attenuation[2])), 1);
                }

                if (lightCount > 3) {
                    angle = dot(normalEye, normalize(lightPositions[3] - vertexPos.xyz));

                    if (angle > 0.0) {
                        vertexColor += vec4((diffuseColors[3] * angle + specularColors[3] * pow(angle, shininess)) * (1.0 / (1.0 + length(lightPositions[3] - vertexPos.xyz) * attenuation[3])), 1);
                    }

                    if (lightCount > 4) {
                        angle = dot(normalEye, normalize(lightPositions[4] - vertexPos.xyz));

                        if (angle > 0.0) {
                            vertexColor += vec4((diffuseColors[4] * angle + specularColors[4] * pow(angle, shininess)) * (1.0 / (1.0 + length(lightPositions[4] - vertexPos.xyz) * attenuation[4])), 1);
                        }

                        if (lightCount > 5) {
                            angle = dot(normalEye, normalize(lightPositions[5] - vertexPos.xyz));

                            if (angle > 0.0) {
                                vertexColor += vec4((diffuseColors[5] * angle + specularColors[5] * pow(angle, shininess)) * (1.0 / (1.0 + length(lightPositions[5] - vertexPos.xyz) * attenuation[5])), 1);
                            }

                            if (lightCount > 6) {
                                angle = dot(normalEye, normalize(lightPositions[6] - vertexPos.xyz));

                                if (angle > 0.0) {
                                    vertexColor += vec4((diffuseColors[6] * angle + specularColors[6] * pow(angle, shininess)) * (1.0 / (1.0 + length(lightPositions[6] - vertexPos.xyz) * attenuation[6])), 1);
                                }

                                if (lightCount > 7) {
                                    angle = dot(normalEye, normalize(lightPositions[7] - vertexPos.xyz));

                                    if (angle > 0.0) {
                                        vertexColor += vec4((diffuseColors[7] * angle + specularColors[7] * pow(angle, shininess)) * (1.0 / (1.0 + length(lightPositions[7] - vertexPos.xyz) * attenuation[7])), 1);
                                    }
                                }
                            }
                        }
                    }
                }
            }
        }
    }

    if (fogStart != -1.0) {
        fogWeight = clamp((-vertexPos.z - fogStart) / (fogEnd - fogStart), 0.0, 1.0);
        fogVertexColor = fogColor * fogWeight;
    } else {
        fogWeight = -1.0;
    }

    vertexColor = vec4(min(WHITE, vertexColor).xyz, alpha);

    if (useColors) {
        vertexColor *= color;
    }

    gl_Position = modelViewProjectionMatrix * position;
}


The assets are the same ones that the example in the wiki uses. The shaders are modified versions of jPCT-AE's default shaders. Depending on your scene, they might not be optimal because they cover the most general case, but anyway...
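For reference, the fog term that both shader versions compute is simple linear fog in eye space, and its arithmetic can be verified outside the shader. A plain-Java sketch (illustrative names, no jPCT dependency): a vertex at fogStart gets weight 0 (no fog), one at fogEnd gets weight 1 (fully fogged), and values in between interpolate linearly.

```java
// Sketch of the shaders' linear fog weight (no jPCT dependency).
public class FogSketch {

    static double clamp(double v, double lo, double hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    // Mirrors: fogWeight = clamp((-vertexPos.z - fogStart) / (fogEnd - fogStart), 0.0, 1.0);
    // In eye space, z is negative in front of the camera, hence the -eyeZ.
    static double fogWeight(double eyeZ, double fogStart, double fogEnd) {
        return clamp((-eyeZ - fogStart) / (fogEnd - fogStart), 0.0, 1.0);
    }

    public static void main(String[] args) {
        System.out.println(fogWeight(-100, 100, 1000));  // 0.0 -> no fog yet
        System.out.println(fogWeight(-550, 100, 1000));  // 0.5 -> halfway
        System.out.println(fogWeight(-1000, 100, 1000)); // 1.0 -> fully fogged
    }
}
```

The fragment shader then blends with `(1.0 - fogWeight) * col.xyz + fogVertexColor`, where fogVertexColor is the fog color pre-scaled by the weight in the vertex shader.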