2. What is Photogrammetry?
• It is “the science of making measurements from photographs.” In the context of our industry, it is the process of converting photos into 3D models.
• It is more commonly used in VFX.
• Games such as The Vanishing of Ethan Carter have used it effectively.
3. How to Properly Take Photogrammetry Photos
Example of a bad capture, by ANDRZEJ POZNANSKI
4. How to Properly Take Photogrammetry Photos
Guide from Kevin Hudson
• “Most of the things that photographers learn are the opposite of what photogrammetry requires.”
• “Avoid high-contrast lighting.”
• “Anything that brings out highlights in your photos will yield terrible results.”
• “Everything in your photos should be static, including background and lighting.”
- ANDRZEJ POZNANSKI
5. Keep in mind that the more photos you take, the better. Here are 3 of the 70 captures I took of this cave column.
The Reed Flute Cave - Guilin, Guangxi, China
6. • To convert the photos to a 3D model, I used Autodesk 123D Catch.
• I walked around this object and took a photo every 2 steps. The red arrows in the image point to the white cameras, which pinpoint where I stood when I took the pictures. (Note that 123D Catch accepts a maximum of 70 photos.)
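The walk-around capture pattern above can be sketched as a quick planning calculation. This is a hypothetical helper (not part of 123D Catch): given the radius of your circular walk and the distance between shots, it estimates the photo count and angular spacing, and checks it against the 70-photo cap.

```python
import math

def plan_capture(radius_m, photo_spacing_m, max_photos=70):
    """Estimate photo count and angular spacing for a circular walk
    around a subject. Inputs are hypothetical example values."""
    circumference = 2 * math.pi * radius_m
    n_photos = math.ceil(circumference / photo_spacing_m)
    angle_deg = 360.0 / n_photos          # degrees between shots
    fits_cap = n_photos <= max_photos     # stays under 123D Catch's limit?
    return n_photos, angle_deg, fits_cap

# e.g. a 3 m radius walk, one photo every ~1.5 m (roughly 2 steps)
n, angle, ok = plan_capture(3.0, 1.5)
```

With those example numbers you get 13 photos about 28 degrees apart, comfortably under the cap; tighter spacing or a bigger subject eats into the 70-photo budget quickly.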
7. • Delete major unwanted areas in 123D Catch and/or Maya.
• Further refine and prep the mesh in Maya before bringing it into ZBrush to reduce the polygon count and make modifications.
• Make sure you scale to the proper size, as there can be height distortion. Fill in any holes and fix any artifacts.
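The "scale to the proper size" step can be sketched in a few lines. This is an illustrative, hypothetical helper (not the Maya workflow itself): it uniformly rescales a vertex list so the mesh's Y extent matches a height you measured on the real object. Note that true height *distortion* may need a non-uniform scale along one axis instead.

```python
def rescale_to_height(vertices, target_height):
    """Uniformly scale (x, y, z) vertices so the Y extent matches a
    known real-world height, correcting overall scale drift from the
    photo reconstruction. Hypothetical helper, not a 123D Catch API."""
    ys = [v[1] for v in vertices]
    current_height = max(ys) - min(ys)
    s = target_height / current_height
    return [(x * s, y * s, z * s) for (x, y, z) in vertices]

# the reconstructed column came in 4 units tall, but it is really 2 m
verts = [(0.0, 0.0, 0.0), (1.0, 2.0, 0.0), (0.5, 4.0, 1.0)]
scaled = rescale_to_height(verts, 2.0)
```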
8. • Use DynaMesh to resolve any artifacts or smoothing errors.
• In ZBrush, use ZRemesher to create the low-poly version of the mesh.
• ZRemesher’s blue/red polypaint defines which areas you want to have a higher or lower polygon count.
9. • 11,456 tris after ZRemeshing. Not too terrible, considering this is just a simple ZRemesh. There is definitely room for improvement.
• Using ZRemesher’s polypaint and curve functions can optimize this even more.
• You can manually tweak the results in TopoGun, 3D-Coat, Maya, etc., to further reduce the size.
10. • Delete faces that will never be seen (as in the first image).
• Make sure the low-poly version roughly matches the high-poly.
• Sometimes ZRemesher is not totally perfect (as in the third image), but simple tweaks can fix it.
11. • For UVs I am using Headus UVLayout and its handy little Maya plug-in. Make sure the mesh is clean, or the plug-in sometimes breaks.
• The plug-in lets you quickly fix bad geo or make little tweaks in Maya while you are making UVs in Headus. Without it, constantly exporting and re-importing for little things adds up to a lot of wasted time.
12. • Add UV cuts along edges whose angle is greater than 80 degrees. Alternatively, if you bevel an edge, cutting it is optional.
• When positioning UVs, keep everything as horizontal or vertical as possible, especially if you use texturing software like Quixel.
• After everything is UV’ed, triangulate the model, then do a “Mesh Display -> Set to Face” followed by “Soften Edges” on the whole model.
• Next, harden all the UV border edges. To do this manually, go into Maya’s UV editor and double-click all of the UV edges to select them. Then do a “Mesh Display -> Harden Edges” to harden the selection.
Preventing Normal Map Baking Errors and Ugly Seams, Part 1/5
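The 80-degree rule above boils down to comparing the normals of the two faces that share an edge. A minimal sketch of that test, assuming unit-length face normals as input (the function name and mesh representation are hypothetical, not any Maya API):

```python
import math

def needs_uv_cut(n1, n2, threshold_deg=80.0):
    """Return True when the angle between two adjacent face normals
    exceeds the threshold, i.e. the shared edge is sharp enough to
    deserve a UV cut per the 80-degree rule of thumb."""
    dot = sum(a * b for a, b in zip(n1, n2))
    dot = max(-1.0, min(1.0, dot))          # guard acos against rounding
    angle = math.degrees(math.acos(dot))
    return angle > threshold_deg

cut_sharp = needs_uv_cut((0, 1, 0), (1, 0, 0))       # normals 90 deg apart
cut_flat = needs_uv_cut((0, 1, 0), (0.1, 0.995, 0))  # nearly parallel normals
```

The sharp corner gets a cut; the nearly flat edge does not, which matches the bevel caveat: once an edge is beveled, its faces meet at shallow angles and the cut becomes optional.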
13. To harden UV border edges automatically, use this MEL script.
Simply go to object mode, select your mesh, and run the
script. Credit goes to Jonathon Stewart for writing it.
You can copy it here:
http://jonathonstewart.blogspot.com.au/2012/10/script-harden-edges-of-all-uv-borders.html
Preventing Normal Map Baking Errors and Ugly Seams, Part 2/5
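What the MEL script identifies can be sketched in plain Python: a UV border edge is one where the faces sharing the edge map its endpoints to different UV coordinates (a seam), or where only one face uses the edge (a mesh border). This sketch assumes a hypothetical mesh format of faces as lists of (vertex_id, uv_id) corner pairs; it is not the script itself.

```python
from collections import defaultdict

def uv_border_edges(faces):
    """Return the set of edges (as sorted vertex-id pairs) that lie on
    a UV border: either the adjacent faces disagree on the UV indices
    of the edge's endpoints, or the edge belongs to a single face."""
    edge_uvs = defaultdict(set)
    edge_count = defaultdict(int)
    for face in faces:
        n = len(face)
        for i in range(n):
            (v0, uv0), (v1, uv1) = face[i], face[(i + 1) % n]
            key = (min(v0, v1), max(v0, v1))
            # store the UV pair in a vertex-consistent order
            edge_uvs[key].add((uv0, uv1) if v0 < v1 else (uv1, uv0))
            edge_count[key] += 1
    return {e for e in edge_uvs
            if len(edge_uvs[e]) > 1 or edge_count[e] == 1}

# two quads: the shared edge (1, 2) is a seam only when the second
# quad maps its corners to different UV indices
welded = [[(0, 0), (1, 1), (2, 2), (3, 3)],
          [(1, 1), (4, 4), (5, 5), (2, 2)]]
seamed = [[(0, 0), (1, 1), (2, 2), (3, 3)],
          [(1, 6), (4, 4), (5, 5), (2, 7)]]
```

Hardening exactly these edges (and softening everything else) is what kills the shading discontinuities at UV seams after the bake.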
14. • The next step is to make a cage for baking in xNormal. First, duplicate the low-poly mesh, put the duplicate in a separate layer, and give it a transparent material to make things easier. This will become the cage for baking your normal map.
• Next, go to the Move Tool settings and change the axis orientation to “Normal”. Then select all the vertices on the cage layer and pull on the “N” handle of the manipulator so the mesh slightly covers the original low-poly mesh (as in the first image).
• Now export the low-poly, high-poly, and cage meshes as OBJs and load them into xNormal. Finally, press Generate Maps to bake the normal and AO maps.
Preventing Normal Map Baking Errors and Ugly Seams, Part 3/5
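Geometrically, dragging the “N” handle just pushes every vertex outward along its vertex normal. A minimal sketch of that cage construction, assuming unit vertex normals and an arbitrary example offset (this mimics the Maya step; it is not a Maya or xNormal API):

```python
def make_cage(vertices, normals, offset=0.05):
    """Build a baking cage by pushing each low-poly vertex outward
    along its unit vertex normal, so the cage slightly covers the
    low-poly mesh. The offset value is an arbitrary example."""
    return [(x + nx * offset, y + ny * offset, z + nz * offset)
            for (x, y, z), (nx, ny, nz) in zip(vertices, normals)]

cage = make_cage([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
                 [(0.0, 1.0, 0.0), (1.0, 0.0, 0.0)],
                 offset=0.1)
```

The offset needs to be just large enough that the cage encloses the high-poly surface everywhere; too large and rays from distant parts of the cage start hitting the wrong geometry.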
15. • Using Marmoset Toolbag, we quickly check for baking errors. The UV seams are gone once the normal map is plugged in. Everything seems to check out.
• If the normal map looks bad but you have followed all the previous steps properly, chances are the Y channel is flipped. Test flipping the Y channel before you go looking in the wrong places for answers.
• We are now ready to bring this into Substance, Quixel, or Mari for texturing.
Preventing Normal Map Baking Errors and Ugly Seams, Part 4/5
Image captions: Lowpoly only | Lowpoly + Normal Map | Lowpoly + Normal Map + AO
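The Y-channel fix mentioned above is a one-liner on the pixel data: invert the green channel of an 8-bit normal map (this is the usual remedy when the baker and the renderer disagree on the Y convention, often described as OpenGL vs DirectX style). A minimal sketch, with the map represented as a flat list of (r, g, b) tuples rather than an image file:

```python
def flip_normal_map_y(pixels):
    """Invert the Y (green) channel of an 8-bit RGB normal map.
    `pixels` is a flat list of (r, g, b) tuples; in a real pipeline
    you would do the same per-pixel operation on the image buffer."""
    return [(r, 255 - g, b) for (r, g, b) in pixels]

flipped = flip_normal_map_y([(128, 200, 255), (128, 55, 255)])
```

Most texturing and baking tools also expose this as a checkbox, so try that before editing pixels by hand.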
16. • If some details are skewed in your normal map, simply add more geometry to the area that is giving you problems.
• Most of the time you won’t have to worry about this unless it’s a hardsurface model.
Preventing Normal Map Baking Errors and Ugly Seams, Part 5/5
Credit for these images goes to Earthquake
17. • At this stage, it is up to you to decide how you want to create textures, whether in just Photoshop, Mari, Substance, or something else.
• In this case, I brought it into Quixel and used a Smart Material as a base to see how it interacts with the maps.
Ready for texturing
18. Of course, there is a lot more room for improvement and optimization. However, this shows the potential of photogrammetry and how quickly it can produce results.
19. • This is mostly important for hardsurface models.
• Often, on a high-poly model, the artist doesn’t soften the edges enough because they don’t consider how the model reads from a distance.
• Take into account how far away the viewer will see the model when deciding how soft the edges should be.
• On the left, the model’s edge is jagged, and the effort spent creating that subtle soft edge and normal map is wasted when viewed from a bit further away.
• From afar you can barely see the smoothness of the right model, but a bunch of small changes can make a big difference.
• Sorry, no TLDR version! :)
More Tips for Hardsurface Baking:
Credit for this image goes to Zoya