Normal Map for Texture

Started by subhalakshmi27, January 27, 2012, 04:23:31 PM

Previous topic - Next topic

subhalakshmi27

Hi All,

My graphic designer would like to know if she can use normal maps for textures. Is it possible to use a texture that has a normal map applied?

Thanks,
Subha

EgonOlsen

Yes, you can use normal mapping when using the OpenGL ES 2.0 support (http://www.jpct.net/wiki/index.php/OpenGL_ES_2.0_support), but you have to write your own vertex and fragment shaders for this. The distribution contains an example of parallax mapping (an advanced form of normal mapping that uses an additional height map) in the HelloShader example.

However, keep in mind that shaders aren't as fast on mobile devices as the fixed function pipeline in many cases. It requires a lot of time and effort to tweak them in a way that they run fast and correctly across all devices.

Varanda

Hello Egon,

About

Quote
keep in mind that shaders aren't as fast on mobile devices as the fixed function pipeline in many cases

Before I create/bake my normal maps: is your statement still valid for devices from 2014 and beyond?
I was considering using normal maps to provide depth for all the doors/windows of the building that I am modeling. Would it be better to use geometry for them instead?


EgonOlsen

It's still kind of valid, but it doesn't matter, because performance is usually high enough anyway. Just go with normal maps. But you have to write your own shader to use them.

Varanda

Thanks... I will look at some GLSL code for that. BTW... were the waving bushes, flowers and tree branches in Naroth animated, or were they implemented in their shaders? In Distortion, a game I collaborated on in the past, we used a dedicated shader for the waving (we used the former C4Engine). If so, would you have some sample code to point me to? Thanks.

EgonOlsen

The plants and trees are using a simple shader. It just checks the current uv-coordinates of the vertex and, if they are within a certain range, applies some simple sin/cos calculations (based on a timer) to it. Which example do you need? One for the waving plants or one for the normal mapping stuff?

Varanda

Both would be welcome  ;D
However, I do not want you to waste much of your time.

Those effects are desired. However, I need to publish this game ASAP in order to take advantage of the political momentum.
If it will demand too much investigation/learning then I may postpone these effects to a rev 2.

Thanks,
Marcelo

EgonOlsen

I usually don't use pure normal mapping, but offset mapping. But I found this in my source tree...it might be a start:

Normal mapping fragment shader:

precision mediump float;

varying vec3 lightVec[2];
varying vec3 eyeVec;
varying vec2 texCoord;

uniform sampler2D textureUnit0;
uniform sampler2D textureUnit1;

uniform vec3 diffuseColors[8];
uniform vec3 specularColors[8];

uniform vec4 ambientColor;

uniform float invRadius;
uniform float heightScale;

varying float angle;

void main ()
{
    vec4 vAmbient = ambientColor;
    vec3 vVec = normalize(eyeVec);

    // Parallax offset: shift the texture coordinates based on the
    // height map stored in the alpha channel of textureUnit1
    float height = texture2D(textureUnit1, texCoord).a;
    vec2 offset = vVec.xy * (height * 2.0 - 1.0) * heightScale;
    vec2 newTexCoord = texCoord + offset;

    vec4 base = texture2D(textureUnit0, newTexCoord);
    vec3 bump = normalize(texture2D(textureUnit1, newTexCoord).xyz * 2.0 - 1.0);

    // First light source

    //float distSqr = dot(lightVec[0], lightVec[0]);
    // clamped to stay within mediump float range on some GPUs
    float distSqr = min(65500.0, dot(lightVec[0], lightVec[0]));
    float att = clamp(1.0 - invRadius * sqrt(distSqr), 0.0, 1.0);
    vec3 lVec = lightVec[0] * inversesqrt(distSqr);

    float diffuse = max(dot(lVec, bump), 0.0);
    vec4 vDiffuse = vec4(diffuseColors[0], 0.0) * diffuse;

    float specular = pow(clamp(dot(reflect(-lVec, bump), vVec), 0.0, 1.0), 0.85);
    vec4 vSpecular = vec4(specularColors[0], 0.0) * specular;

    gl_FragColor = (vAmbient * base + vDiffuse * base + vSpecular) * att * 2.0 * angle;
}


...and the corresponding vertex shader:


uniform mat4 modelViewMatrix;
uniform mat4 modelViewProjectionMatrix;

uniform vec4 additionalColor;
uniform vec4 ambientColor;

uniform vec3 lightPositions[8];

attribute vec4 position;
attribute vec3 normal;
attribute vec4 tangent;
attribute vec2 texture0;

varying vec3 lightVec[2];
varying vec3 eyeVec;
varying vec2 texCoord;
varying float angle;

void main(void)
{
    texCoord = texture0.xy;

    // Transform normal and tangent into eye space and build the bitangent
    vec3 n = normalize(modelViewMatrix * vec4(normal, 0.0)).xyz;
    vec3 t = normalize(modelViewMatrix * vec4(tangent.xyz, 0.0)).xyz;
    vec3 b = tangent.w * cross(n, t);

    vec3 vVertex = vec3(modelViewMatrix * position);
    vec3 tmpVec = lightPositions[0].xyz - vVertex;

    angle = max(0.0, dot(n, normalize(tmpVec)));
    angle = 1.0; // overrides the line above, effectively disabling the angle term

    // Transform the light vector into tangent space
    vec3 lv;
    lv.x = dot(tmpVec, t);
    lv.y = dot(tmpVec, b);
    lv.z = dot(tmpVec, n);

    lightVec[0] = lv;

    // ...and the eye vector, too
    tmpVec = -vVertex;
    eyeVec.x = dot(tmpVec, t);
    eyeVec.y = dot(tmpVec, b);
    eyeVec.z = dot(tmpVec, n);

    gl_Position = modelViewProjectionMatrix * position;
}


When using these, make sure to assign the shader before calling build() on the Object3D, so that jPCT-AE can do its magic and inject the tangent vectors into the shader.
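The tangent-space transform that the vertex shader performs (projecting the light and eye vectors onto the tangent, bitangent and normal via dot products) can be sketched on the CPU side in plain Java like this; the class and method names are purely illustrative, not part of any jPCT-AE API:

```java
public class TangentSpace {

    // Project an eye-space vector into tangent space by taking its
    // dot product with the tangent (t), bitangent (b) and normal (n),
    // exactly as the vertex shader does for lightVec[0] and eyeVec.
    static float[] toTangentSpace(float[] v, float[] t, float[] b, float[] n) {
        return new float[] { dot(v, t), dot(v, b), dot(v, n) };
    }

    static float dot(float[] a, float[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    public static void main(String[] args) {
        // With an axis-aligned basis, the transform just reorders components
        float[] t = {1, 0, 0}, b = {0, 1, 0}, n = {0, 0, 1};
        float[] light = {0.5f, 0.25f, 1.0f};
        float[] lv = toTangentSpace(light, t, b, n);
        System.out.println(lv[0] + " " + lv[1] + " " + lv[2]); // 0.5 0.25 1.0
    }
}
```

With a non-trivial basis (a rotated surface), the same three dot products express the light direction relative to the surface, which is what lets the fragment shader compare it against the tangent-space normals from the normal map.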

EgonOlsen

And for the plants, here's the fragment shader:


precision mediump float;

uniform sampler2D textureUnit0;

varying vec2 texCoord0;

varying vec4 vertexColor;
varying vec4 fogVertexColor;

void main (void) {
    gl_FragColor = vertexColor * texture2D(textureUnit0, texCoord0) + fogVertexColor;
}


and the vertex shader:

attribute vec2 texture0;
attribute vec4 position;
attribute vec3 normal;

uniform vec4 additionalColor;
uniform vec4 ambientColor;

uniform mat4 modelViewMatrix;
uniform mat4 modelViewProjectionMatrix;

uniform float fogStart;
uniform float fogEnd;
uniform vec3 fogColor;

uniform vec3 lightPositions[8];
uniform vec3 diffuseColors[8];

uniform float alpha;
uniform float random;
uniform float muly;

varying vec2 texCoord0;
varying vec4 vertexColor;
varying vec4 fogVertexColor;

const vec4 WHITE = vec4(1.0,1.0,1.0,1.0);

void main(void)
{
    texCoord0 = texture0;

    vertexColor = additionalColor + ambientColor;

    vec3 vVertex = vec3(modelViewMatrix * position);
    vec3 normalEye = normalize(modelViewMatrix * vec4(normal, 0.0)).xyz;
    float angle = dot(normalEye, normalize(lightPositions[0] - vVertex));

    if (angle > 0.0) {
        vertexColor += vec4(diffuseColors[0] * angle, 0.0);
    }

    vertexColor = 2.0 * vec4(min(WHITE, vertexColor).xyz, alpha * 0.5);

    // Simple linear fog based on eye-space depth
    float fogWeight = clamp((-vVertex.z - fogStart) / (fogEnd - fogStart), 0.0, 1.0);
    fogVertexColor = vec4(fogColor, 0.0) * fogWeight;
    fogWeight = 1.0 - fogWeight;

    vec4 pos = modelViewProjectionMatrix * position;

    vertexColor *= fogWeight;

    // Only the upper part of the plant (uv.y < 0.5) waves, and only
    // when it's reasonably close to the camera
    if (texCoord0.y < 0.5) {
        if (pos.z < 2000.0) {
            float cosy = cos(mod(random * 0.005 + position.z * 0.5, 6.2831)) * muly;
            pos.x = pos.x + cosy;
            pos.y = pos.y - cosy;
        }
        vertexColor *= 1.15;
    }

    gl_Position = pos;
}


It's basically a modified default shader (you can unzip the jPCT-AE jar to gain access to the default shaders' sources if needed). The random uniform in that shader isn't really a random value (I guess it was at some point, hence the name) but the time in ms mod 65536.
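The wave term that this time value drives can be sketched on the CPU side like this; it is a hedged illustration mirroring the shader's cos(mod(...)) line, not jPCT-AE API, and the names are made up:

```java
public class PlantWave {

    // The "random" uniform gets the current time in ms modulo 65536,
    // as described above. The shader turns it into a periodic offset:
    //   cos(mod(random * 0.005 + position.z * 0.5, 6.2831)) * muly
    static float waveOffset(long timeMillis, float z, float muly) {
        float random = timeMillis % 65536L;
        double phase = (random * 0.005 + z * 0.5) % 6.2831;
        return (float) (Math.cos(phase) * muly);
    }

    public static void main(String[] args) {
        // At time 0 and z = 0 the phase is 0, so the offset equals muly
        System.out.println(waveOffset(0L, 0.0f, 3.0f)); // 3.0
    }
}
```

The mod by 65536 keeps the uniform small enough for limited-precision floats on mobile GPUs, and the mod by 6.2831 (2*pi) keeps the cos argument in one period, so the offset oscillates smoothly between -muly and +muly as time advances.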

Some info on GPUs and their shader-related quirks: http://www.jpct.net/wiki/index.php?title=GPU_guide

Varanda

Fantastic. I have never written a GLSL shader, but I was debugging a port for Mac where we were getting GLSL compiler errors at runtime. It was a collaboration on the following Blender branch by Clément:

http://www.clement-foucault.com/#blender_pbr

Anyways... it is great to have something to start with.

Thanks, Thanks, Thanks !!!

EgonOlsen

As said: The default shaders are all part of the jar file and can be extracted. They look ugly in parts, but they are proven to run fine on any device that I'm aware of (which is why they look like that...).

Varanda

Thanks. I know what you mean. I wrote the live streaming stack for BB10 (at QNX). I built a nice code base and implemented the full stack in 4-5 months. Then I spent about 1.5 years making that stack work with dozens of media content providers who had different interpretations of the protocols, codecs and everything else you can think of. My beautiful base code became a mess in order to work with all providers.