How do you apply a normal map to a 3D mesh?

Computer Graphics Asked by Calvin Godfrey on August 27, 2021

I’m writing my own mesh renderer, and I was previously able to apply a normal map to a sphere, so I understand the basic process of applying a normal map. But my understanding is that in order to apply it, you need a consistent tangent space at every point on the mesh. It’s possible on a sphere because you can calculate the spherical coordinates and then pick orthogonal tangents parallel to lines of longitude/latitude, but as far as I can tell, there’s no clear way to do that for an arbitrary mesh.
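For concreteness, the longitude/latitude construction described above can be sketched like this (a minimal illustration, not from the original post; note the basis is degenerate at the poles, where longitude is undefined):

```python
import math

def sphere_tangent_basis(p):
    """Tangent basis at a point p = (x, y, z) on a unit sphere:
    tangent along lines of latitude (increasing longitude),
    bitangent completing a right-handed frame with the normal."""
    x, y, z = p
    theta = math.atan2(z, x)  # longitude
    # direction of increasing longitude, always horizontal
    tangent = (-math.sin(theta), 0.0, math.cos(theta))
    # on a unit sphere the outward normal is the position itself
    normal = (x, y, z)
    # bitangent = normal x tangent
    bitangent = (
        normal[1] * tangent[2] - normal[2] * tangent[1],
        normal[2] * tangent[0] - normal[0] * tangent[2],
        normal[0] * tangent[1] - normal[1] * tangent[0],
    )
    return tangent, bitangent, normal
```

The three returned vectors are mutually orthogonal everywhere except at the poles, which is exactly the kind of consistent per-point frame that an arbitrary mesh does not automatically provide.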

To give a specific example, I found this free .obj file online, and I can render it with textures, just without normal mapping. But if I open the model in the generic Windows "3D Viewer", it has normal mapping applied, so clearly it’s possible. I couldn’t find anything online. Is there a way to do this?

There are numerous approaches to setting up tangent bases on a mesh, and unfortunately no universal standard for how they are calculated.

Tangents are based on the mesh's UV mapping, so that the tangent vector points (at least roughly) along the U axis in texture space, and the bitangent along the V axis. This means each triangle in a mesh has its own tangent basis (as long as the UV mapping is nondegenerate). This often gets converted into a smoothed tangent basis per vertex, much like you can create smoothed normals by setting each vertex normal to an average over the normals of the surrounding triangles. Different tools and engines might do the smoothing in different ways, might handle UV seams and mirroring differently, might orthonormalize the basis per vertex or per pixel or not at all, etc. Here is a nice article to read if you want to know more. And as noted there, there is a method for tangent spaces called mikktspace that is gradually gaining widespread support and might become a de facto standard.
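The per-triangle construction described above can be sketched as follows (a minimal illustration, not from the original answer, using the standard solve of the edge/UV-delta system; degenerate UVs need special handling in practice):

```python
def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Tangent/bitangent for one triangle, chosen so the tangent
    points along the U axis and the bitangent along the V axis
    in texture space. Assumes the UV mapping is nondegenerate."""
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    # position edges and corresponding UV deltas
    e1, e2 = sub(p1, p0), sub(p2, p0)
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]

    det = du1 * dv2 - du2 * dv1
    r = 1.0 / det  # det == 0 means a degenerate UV mapping
    tangent = tuple(r * (dv2 * a - dv1 * b) for a, b in zip(e1, e2))
    bitangent = tuple(r * (du1 * b - du2 * a) for a, b in zip(e1, e2))
    return tangent, bitangent
```

A smoothed per-vertex basis is then typically built by accumulating each vertex's incident triangle tangents, normalizing the sums, and re-orthogonalizing against the vertex normal (e.g. Gram-Schmidt: `T' = normalize(T - N * dot(N, T))`), which is where tools start to diverge from one another.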

Object/character models like the one you linked usually have normal maps baked out from a content creation tool (e.g. based on an ultra-high-detail source mesh); ideally, the tangent basis you use in your renderer should match the one the tool used to do the bake. Sometimes vertex tangents are stored in the mesh, but the .obj format doesn't support them, so you would need to check whether the model ships tangents in one of the other formats, such as .fbx.
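Once you have a tangent basis, applying the baked map comes down to decoding the texel and rotating it out of tangent space. A minimal sketch (illustrative only; a real renderer does this per pixel in a shader, and conventions such as the V-channel sign vary between bakers):

```python
import math

def perturb_normal(texel, tangent, bitangent, normal):
    """Decode an RGB normal-map texel and rotate it from tangent
    space into the space of the given TBN vectors."""
    # map color channels from [0, 1] to vector components in [-1, 1]
    nx, ny, nz = (2.0 * c - 1.0 for c in texel)
    # n = nx * T + ny * B + nz * N
    n = tuple(nx * t + ny * b + nz * nrm
              for t, b, nrm in zip(tangent, bitangent, normal))
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)
```

The "flat" texel (0.5, 0.5, 1.0) decodes to (0, 0, 1) in tangent space and so comes out as the unperturbed surface normal, which is a handy sanity check when debugging a tangent-basis mismatch.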

Answered by Nathan Reed on August 27, 2021
