Using a single image as a cube map in OpenGL GLSL

I am working on an Augmented Reality application using ARCore. I am drawing an object in my scene (assume it is a sphere) using OpenGL and GLSL shaders, and I want to apply environment mapping to the object using the ARCore background texture image. I know that to create an environment map (cube map) in OpenGL I need 6 images. In an AR application we only have access to the currently visible area, which the AR SDK provides as a single texture. I want to find a way to divide this image into 6 logical parts. I have seen some examples, such as Convert 2:1 equirectangular panorama to cube map, but could not find anything clear. Also, those examples physically split the image into 6 parts, whereas I would prefer to transform the texture coordinates in the fragment shader instead of actually dividing the image and uploading a cube-map texture uniform every frame.
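(Side note, based on the usual ARCore setup rather than anything stated above: the camera image ARCore provides is bound as a GL_TEXTURE_EXTERNAL_OES texture, so sampling it directly from a GLSL ES 3.0 fragment shader needs an external sampler declaration rather than a plain sampler2D:

    #extension GL_OES_EGL_image_external_essl3 : require
    uniform samplerExternalOES ARCoreSampler;  // ARCore camera image (external texture)

If the camera image is first copied into an ordinary 2D texture, the sampler2D used in the shader below works as-is.)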

I am attaching the image specs and how I am going to divide it to map to the 6 faces of the cube.

[image: the single texture divided into a 3-column x 4-row grid, with one cube face per occupied cell]

Here is a sketch of what I am trying to do in the fragment shader (GLSL). I am looking for a way to convert the reflected direction vector into a 2D texture coordinate. If I had a cube map, I could simply sample it with the 3D direction. Since I only have one image, I want to convert this 3D direction into a 2D texture coordinate, assuming the image is divided into 6 logical cube faces and this cube surrounds the object I am drawing. Please check whether the math is correct.

#version 300 es
precision mediump float;

uniform sampler2D ARCoreSampler;

in vec3 v_normal;
in vec3 v_camera_pos;
in vec3 v_model_pos;

out vec4 fragColor;

// The single image is treated as a 3-column x 4-row grid of cube faces
// (a vertical cross, rows counted from the bottom of the texture):
// back at (col 1, row 0), bottom at (1, 1), left/front/right across
// row 2, and top at (1, 3).
const float unitsx = 3.0;
const float unitsy = 4.0;

// Map face-local coordinates st in [-1, 1] into the grid cell (col, row).
vec2 faceUV(vec2 st, float col, float row) {
    vec2 f = st * 0.5 + 0.5;  // [-1, 1] -> [0, 1]
    return vec2((f.x + col) / unitsx, (f.y + row) / unitsy);
}

void main() {
    vec3 Normal = normalize(v_normal);
    vec3 ViewVector = normalize(v_camera_pos - v_model_pos);
    // reflect() expects the incident vector to point towards the surface,
    // so the surface-to-camera vector has to be negated.
    vec3 direction = normalize(reflect(-ViewVector, Normal));

    // Select the face from the component with the largest magnitude (the
    // same test a hardware cube-map lookup uses), then project the other
    // two components onto the face plane by dividing by that component.
    // Note: atan() returns values in (-pi, pi], so angle tests such as
    // "ztan >= 5.0*M_PI/4.0" can never be true, which is why the
    // angle-based face selection was replaced by this component test.
    vec3 a = abs(direction);
    vec2 uv;
    if (a.y >= a.x && a.y >= a.z) {
        if (direction.y > 0.0)
            uv = faceUV(direction.xz / a.y, 1.0, 3.0);  // top
        else
            uv = faceUV(direction.xz / a.y, 1.0, 1.0);  // bottom
    } else if (a.z >= a.x) {
        if (direction.z > 0.0)
            uv = faceUV(direction.xy / a.z, 1.0, 2.0);  // front
        else
            uv = faceUV(direction.xy / a.z, 1.0, 0.0);  // back
    } else {
        if (direction.x < 0.0)
            uv = faceUV(direction.zy / a.x, 0.0, 2.0);  // left
        else
            uv = faceUV(direction.zy / a.x, 2.0, 2.0);  // right
    }
    // Depending on how the faces are oriented in the image, individual
    // faces may still need their s or t axis mirrored here.

    vec4 envColor = texture(ARCoreSampler, uv);
    fragColor = envColor;
}
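For comparison, this is the one-line lookup mentioned above for the case where a real cube map is available (EnvCubeSampler is a hypothetical uniform, not part of the code above):

    uniform samplerCube EnvCubeSampler;
    // ...
    vec4 envColor = texture(EnvCubeSampler, direction);  // hardware picks the face

The shader above emulates exactly this face selection and projection by hand, with the result remapped into the sub-rectangles of the single 3x4 image.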

Tags: opengl-es, glsl, augmented-reality, arcore, texture-mapping
