# ShaderX³: Geometry Manipulation - Morphing between two different objects

Published on May 7, 2013

This is an article from 2004, which was published in the book "ShaderX 3".



### Theory

In the early nineties a movie impressed audiences with computer-generated effects that had never been seen before: "Terminator 2 – Judgment Day" can be called the beginning of photo-realistic computer graphics in movies. The most important effects were the various transformations of the T-1000, the enemy machine of the story. Those transformations were created with a technique called 'morphing'. Morphing can be done in image space, where one two-dimensional image or video source is transformed into another. For "Terminator 2" it was done three-dimensionally, which means that one 3D mesh is transformed into another. Neither version was meant to be used in real time, but that is possible with today's graphics hardware. We will only look at an implementation of the 3D version.

Vertex tweening is an easy way to move a vertex of a mesh independently of the others. Here every vertex has, beside the source position, a relative or absolute destination position vector. With a dynamic blending factor, which is equal for all vertices at a time, you can interpolate between source and destination position. For a relative destination the formula looks like this:

    Position_Output = Position_Source + Position_Destination · Factor

And with an absolute destination position we need to calculate the relative one:

    Position_Output = Position_Source + (Position_Destination − Position_Source) · Factor

The positions are 3D vectors with x, y and z components, and the blending factor is a scalar value. In this article we will only use relative destination vectors, because that saves rendering time and code, as you can see by comparing the two formulas above.

Using only this technique results in a lot of limits, because start and target mesh are the same apart from the vertex positions. That means there is no difference in the number of vertices, and the faces and attributes are the same.
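The two tweening variants can be sketched in a few lines of plain C++. The minimal vector type below stands in for the D3DX math types; all names are illustrative, not the article's code:

```cpp
#include <cassert>

// Minimal 3D vector stand-in for D3DXVECTOR3.
struct Vec3 { float x, y, z; };

Vec3 operator+( Vec3 a, Vec3 b ) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
Vec3 operator-( Vec3 a, Vec3 b ) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
Vec3 operator*( Vec3 a, float s ) { return { a.x * s, a.y * s, a.z * s }; }

// Tweening with a relative destination vector (the variant the article uses):
// one multiply-add per component.
Vec3 TweenRelative( Vec3 source, Vec3 relativeDestination, float factor )
{
    return source + relativeDestination * factor;
}

// Tweening with an absolute destination: the relative vector has to be
// computed first, costing one extra subtraction per vertex.
Vec3 TweenAbsolute( Vec3 source, Vec3 absoluteDestination, float factor )
{
    return source + ( absoluteDestination - source ) * factor;
}
```

Both functions agree for the same geometry; the relative form simply moves the subtraction out of the per-frame work, which is why the article precomputes relative destination vectors.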
Start and target mesh have equal materials, textures, shaders and further states. So vertex tweening alone is only useful to animate objects in ways where mesh skinning fails.

To morph between two different objects we can use vertex tweening, but we will transform and render both meshes at once. Beforehand the destination positions of the mesh vertices have to be calculated. This can be done by projecting the vertices of the first mesh onto the second one and vice versa. We use the vertex position as the origin of a ray–mesh intersection test, which is a function that checks if and where a ray intersects at least one face of a mesh. If there are multiple intersections, the nearest one is a good vector to use as destination position. Without any intersection the source position should be used as destination; in this case the relative destination is the zero vector.

For the intersection ray we also need a direction vector beside the origin. This can be the normal of a vertex, or we calculate a vector from the origin to the mesh or to a user-defined center. We should also test the inverted direction vector to get all possible intersections. That is not needed if the origin is situated outside of both meshes, since
we do not have to use the vertex position as origin. For example it is possible to use the bounding sphere of a mesh:

    Direction = Center − Position
    Origin = −Direction · Radius + Center

Here Direction is normalized, so that Origin lies on the hull of the sphere.

This is very useful if you have complex objects like helicopters with cockpit interior. Using the bounding sphere projects every vertex of one mesh onto the hull of the other one. Otherwise it could happen that some hull vertices are projected onto faces of the interior. Choosing the best values always depends on the kind of mesh design. After the destination vector is computed, we store it in the vertex data.

Now we know where a vertex has to be moved to get an object that has structures like the other one. It is possible to tessellate the objects to increase the accuracy of those structures, but you do not have to, because we want to keep the good performance of optimized low-polygon objects. Other tricks to improve quality are described later in this article.

After the preprocessing is done, we are able to render the objects with the morphing effect. This could be done by the application, but we concentrate on a vertex shader, because this improves performance on DirectX-8-compatible graphics cards and works in software mode for older hardware.

We have to set the current interpolation factor as a shader constant to render a morphing mesh. For the target mesh the factor has to be inverted by subtracting it from one; in this way both meshes are always in the same state. Now we use the factor to interpolate between source and destination position. Other vertex processing like lighting and texture coordinate transformation can be done as usual. It is possible to render both objects every frame, or we render only the start mesh up to the halfway point and then only the target one.
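The bounding-sphere ray construction described above takes only a few lines. A sketch with a minimal vector type (illustrative names, not the article's code):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Build the projection ray from the bounding sphere (center, radius) of the
// target mesh and a source vertex position, following the formulas
//     Direction = normalize( Center - Position )
//     Origin    = -Direction * Radius + Center
// so the origin lies on the sphere hull, on the same side as the vertex.
void BuildSphereRay( Vec3 center, float radius, Vec3 position,
                     Vec3* origin, Vec3* direction )
{
    Vec3 d = { center.x - position.x,
               center.y - position.y,
               center.z - position.z };
    float length = std::sqrt( d.x * d.x + d.y * d.y + d.z * d.z );
    d.x /= length; d.y /= length; d.z /= length;   // normalize toward center

    *direction = d;
    origin->x  = center.x - d.x * radius;
    origin->y  = center.y - d.y * radius;
    origin->z  = center.z - d.z * radius;
}
```

Note that this assumes the vertex does not coincide with the sphere center; real code would guard against a zero-length direction.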
This can look strange or ugly, but there are optimizations which can be done (see the Optimization part).

This is a screenshot from LightBrain's game prototype "Rise of the Hero", implementing the morphing effect, which was originally created for that project. ©2004 LightBrain GmbH, Hamburg, Germany
### Implementation

I am using DirectX 9 to show an implementation of the morphing algorithm. The DirectX extensions will help me to save time, so the basic functions of 3D graphics programming will not be implemented here. For experienced OpenGL programmers it should be no problem to convert the code or to write an own program on the basis of the algorithm.

For the objects we are able to use extension meshes, which are accessed over the ID3DXMesh interface. A mesh stores vertices, an index list of the triangle faces between them and a table of attributes for the triangles. The attributes are identification numbers, which divide the mesh into different subsets that can be rendered with various states like materials or textures.

It is possible to load a mesh over the D3DXLoad[…]MeshX[…] functions or to define an own mesh by locking the vertex, index and attribute buffers and setting the data. For now we go the first way and load the meshes from common DirectX files, which other programs are able to read and write. Beside the meshes we get an array of materials including texture file names for all subsets. A subset is rendered by setting material and textures first and then calling ID3DXMesh::DrawSubset( nSubset ), where nSubset is the number of the subset.

To preprocess the meshes we have to enhance the vertex data first, so the relative destination position can be stored in it. There are two formats in Direct3D to define the structure of a vertex: Flexible vertex formats are the ones to use for fixed-function pipeline processing, which transforms the vertex data with a fixed set of functions whose parameters can be set over Direct3D. Because the possibilities of those functions were limited, vertex shaders, in which the processing can be programmed, were introduced. For vertex shaders there is a much more flexible format: the vertex declaration allows everybody to include all data which is needed.
We are using vertex shaders, so we will also use such declarations. At first they seem to be more complicated, but they enable us to be more compatible with other effects.

A declaration is defined by an array of D3DVERTEXELEMENT9 elements, and the last one must include the data of the D3DDECL_END() macro. Every other element defines one data element of a vertex by setting the offset in the vertex data (in bytes), the type of the data (e.g. D3DDECLTYPE_FLOAT3 for a 3D vector), a method for hardware tessellation, the usage (e.g. position or texture coordinate) and the usage index, if more than one element of the same usage is stored. Because these declarations are also used to pass vertex data on to the pipeline, a stream number can be specified, too. In this way multiple vertex buffers can be used to render one set of primitives. But our meshes include only one vertex buffer, so the stream number should be set to zero.

A common 3D vertex includes a position and a normal vector and one 2D texture coordinate. The vertex declaration for such a vertex looks like this:

```cpp
D3DVERTEXELEMENT9 pStandardMeshDeclaration[] =
{
    {  0,  0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT,
       D3DDECLUSAGE_POSITION, 0 },
    {  0, 12, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT,
       D3DDECLUSAGE_NORMAL, 0 },
    {  0, 24, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT,
       D3DDECLUSAGE_TEXCOORD, 0 },
    D3DDECL_END()
};
```

At the beginning of the vertex data (byte 0) there is a 3D vector for the standard vertex position. It is followed by the 3D vector of the vertex normal. This data is positioned at offset 12, because the position vector consists of 3 FLOAT values; a FLOAT value has a size of 4 bytes (= sizeof( FLOAT )), and the multiplication by 3 elements results in 12 bytes. Because the normal has the same size, the offset of the texture coordinate is 24 (= 2 vectors * 12 bytes). The texture coordinate, however, is only a 2D vector. That means the vertex data has a size of 32 bytes (= 2 vectors * ( 3 floats * 4 bytes ) + 1 vector * ( 2 floats * 4 bytes )). This is important, because we want to add an element for the destination position:

```cpp
D3DVERTEXELEMENT9 pMorphingMeshDeclaration[] =
{
    {  0,  0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT,
       D3DDECLUSAGE_POSITION, 0 },
    {  0, 12, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT,
       D3DDECLUSAGE_NORMAL, 0 },
    {  0, 24, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT,
       D3DDECLUSAGE_TEXCOORD, 0 },
    {  0, 32, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT,
       D3DDECLUSAGE_POSITION, 1 },
    D3DDECL_END()
};
```

Now we have added a second position vector, which must have a higher usage index than the standard one. The whole enhancement can be done automatically with the following steps: We use D3DXGetDeclVertexSize() to retrieve the vertex size of the original declaration, and we go through the declaration to find the highest usage index of a position element. Next, the destination position element for morphing can be written to the D3DDECL_END() entry; D3DXGetDeclLength() returns the number of this entry increased by one. As usage index we take the highest found index and add one to it.
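The offset arithmetic above can be double-checked with a plain struct whose fields mirror the enhanced declaration. This is only an illustration of the memory layout; Direct3D never requires such a struct:

```cpp
#include <cstddef>

// Fields in the same order and with the same types as the enhanced
// morphing declaration, so offsetof() reproduces the byte offsets
// 0, 12, 24 and 32 used in the declaration.
struct MorphingVertex
{
    float position[ 3 ];     // POSITION, usage index 0 -> offset  0
    float normal[ 3 ];       // NORMAL                  -> offset 12
    float texcoord[ 2 ];     // TEXCOORD                -> offset 24
    float destination[ 3 ];  // POSITION, usage index 1 -> offset 32
};

// All members are 4-byte floats, so the compiler inserts no padding and
// the enhanced vertex is 44 bytes in total.
static_assert( offsetof( MorphingVertex, normal )      == 12, "normal offset" );
static_assert( offsetof( MorphingVertex, texcoord )    == 24, "texcoord offset" );
static_assert( offsetof( MorphingVertex, destination ) == 32, "destination offset" );
static_assert( sizeof( MorphingVertex )                == 44, "vertex size" );
```

The static assertions make the layout claim verifiable at compile time.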
The last thing is to write the D3DDECL_END() data to the new end. If pStandardMeshDeclaration was the original declaration, it has now been enhanced to pMorphingMeshDeclaration. You can see the routine in listing 1.

```cpp
D3DVERTEXELEMENT9 pMeshDeclaration[ MAX_FVF_DECL_SIZE ];
DWORD             nPosition   = 0;
DWORD             nUsageIndex = 0;
DWORD             nOffset;

...

// process all declaration elements until end is reached
while( pMeshDeclaration[ nPosition ].Stream != 0xFF )
{
    // check for higher index of a position usage
    if( ( pMeshDeclaration[ nPosition ].Usage
            == D3DDECLUSAGE_POSITION )
        && ( pMeshDeclaration[ nPosition ].UsageIndex
            >= nUsageIndex ) )
        nUsageIndex = pMeshDeclaration[ nPosition ].UsageIndex + 1;

    // increase position in declaration array
    ++nPosition;
}

// get element number for new entry
nPosition = D3DXGetDeclLength( pMeshDeclaration ) - 1;
nOffset   = D3DXGetDeclVertexSize( pMeshDeclaration, 0 );

// move end element
memmove( &pMeshDeclaration[ nPosition + 1 ],
    &pMeshDeclaration[ nPosition ], sizeof( D3DVERTEXELEMENT9 ) );

// add new position element
pMeshDeclaration[ nPosition ].Stream     = 0;
pMeshDeclaration[ nPosition ].Offset     = nOffset;
pMeshDeclaration[ nPosition ].Type       = D3DDECLTYPE_FLOAT3;
pMeshDeclaration[ nPosition ].Method     = D3DDECLMETHOD_DEFAULT;
pMeshDeclaration[ nPosition ].Usage      = D3DDECLUSAGE_POSITION;
pMeshDeclaration[ nPosition ].UsageIndex = nUsageIndex;
```

Listing 1. Enhancing the vertex declaration for a morphing mesh.

The next step is to clone the start mesh using the new declaration as parameter. ID3DXMesh::CloneMesh() creates a new mesh object with the same data as the original one, but including space for the destination position. If you do not want to use the original mesh any longer (e.g. for rendering it without morphing), it can be released.

The vertex buffer of the cloned mesh must be locked now, so we can calculate its destination positions. Every vertex has to be projected onto the target mesh. For this there is an extension function of Direct3D: D3DXIntersect() checks where a ray intersects an extension mesh. We can use any ray origin and direction we want and will get all possible projection points. As mentioned, it is most useful to take the nearest one.
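D3DXIntersect() is a black box here; conceptually such a test checks the ray against every triangle of the mesh and keeps the nearest hit. That can be sketched in plain C++ with a ray–triangle routine (Möller–Trumbore is assumed here; the article does not specify D3DXIntersect's internals, and all names are illustrative):

```cpp
#include <cfloat>
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  sub( Vec3 a, Vec3 b )   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float dot( Vec3 a, Vec3 b )   { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  cross( Vec3 a, Vec3 b ) { return { a.y * b.z - a.z * b.y,
                                                a.z * b.x - a.x * b.z,
                                                a.x * b.y - a.y * b.x }; }

// Möller–Trumbore ray/triangle test; reports a hit with distance t >= 0.
bool RayTriangle( Vec3 origin, Vec3 direction,
                  Vec3 v0, Vec3 v1, Vec3 v2, float* t )
{
    const float epsilon = 1e-6f;
    Vec3 edge1 = sub( v1, v0 ), edge2 = sub( v2, v0 );
    Vec3 p = cross( direction, edge2 );
    float det = dot( edge1, p );
    if( std::fabs( det ) < epsilon ) return false;   // ray parallel to triangle
    float inv = 1.0f / det;
    Vec3 s = sub( origin, v0 );
    float u = dot( s, p ) * inv;
    if( u < 0.0f || u > 1.0f ) return false;
    Vec3 q = cross( s, edge1 );
    float v = dot( direction, q ) * inv;
    if( v < 0.0f || u + v > 1.0f ) return false;
    *t = dot( edge2, q ) * inv;
    return *t >= 0.0f;
}

// Test the ray against every face (triangles stored as flat vertex
// triples) and keep the nearest hit, as the projection step requires.
bool RayMeshNearest( Vec3 origin, Vec3 direction,
                     const std::vector<Vec3>& tris, float* nearest )
{
    bool hit = false;
    *nearest = FLT_MAX;
    for( std::size_t i = 0; i + 2 < tris.size(); i += 3 )
    {
        float t;
        if( RayTriangle( origin, direction, tris[ i ], tris[ i + 1 ],
                         tris[ i + 2 ], &t ) && t < *nearest )
        {
            *nearest = t;
            hit = true;
        }
    }
    return hit;
}
```

A real implementation would add an acceleration structure, but the linear scan shows the contract: nearest positive hit distance, or no hit at all.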
The source position has to be subtracted to get the relative destination vector, which can then be stored in the vertex data (see listing 2). Fortunately, reading and writing vertex data is not as hard as it seems to be. Vertex declarations make it easy to get the offset of a specific vertex element. To retrieve the source position we look for an element of type D3DDECLTYPE_FLOAT3, usage D3DDECLUSAGE_POSITION and usage index 0; to get the normal, the usage has to be D3DDECLUSAGE_NORMAL. Then we take the offset to read the 3D vector from the vertex data. Accessing a specific vertex is possible by doing the following:

```cpp
VOID* pVertex = (BYTE*) pData + nVertexSize * nVertex;
```

pData is the start address of the vertex buffer data, nVertexSize is the size of one vertex, which can be calculated by calling D3DXGetDeclVertexSize(), and nVertex is the number of the vertex that should be accessed. pVertex stores the address of this vertex and can be used to read and write the vectors:

```cpp
D3DXVECTOR3 vct3SourcePosition
    = *(D3DXVECTOR3*)( (BYTE*) pVertex + nOffsetSourcePosition );

...

*(D3DXVECTOR3*)( (BYTE*) pVertex + nOffsetDestinationPosition )
    = vct3DestinationPosition;
```

The offsets, which we got from the vertex declaration, are stored in nOffsetSourcePosition for the source and nOffsetDestinationPosition for the destination position.

```cpp
ID3DXMesh*  pmshDestination;  // pointer to destination mesh interface
D3DXVECTOR3 vct3Source;       // source position (vertex input)
D3DXVECTOR3 vct3Destination;  // destination position (vertex output)
D3DXVECTOR3 vct3Direction;    // ray direction vector
D3DXVECTOR3 vct3Center;       // bounding sphere center (parameter)
FLOAT       fRadius;          // bounding sphere radius (parameter)
FLOAT       fDistance;        // distance from sphere to mesh
BOOL        bIntersection;    // intersection flag

...

// calculate direction from vertex position to sphere center
D3DXVec3Normalize( &vct3Direction,
    &D3DXVECTOR3( vct3Center - vct3Source ) );

// compute intersection with destination mesh from a point on the
// bounding sphere in direction of the center
D3DXIntersect( pmshDestination,
    &D3DXVECTOR3( vct3Center - vct3Direction * fRadius ),
    &vct3Direction, &bIntersection, NULL, NULL, NULL, &fDistance,
    NULL, NULL );

// check for intersection
if( bIntersection )
{
    // calculate projected vector and subtract source position
    vct3Destination = vct3Center + vct3Direction *
        ( fDistance - fRadius ) - vct3Source;
}
else
{
    // set relative destination position to zero
    vct3Destination = D3DXVECTOR3( 0.0f, 0.0f, 0.0f );
}
```

Listing 2. Calculating the destination position for a vertex.

©2004 Ronny Burkersroda
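The raw pointer arithmetic around listing 2 can also be written with memcpy, which sidesteps the alignment and strict-aliasing concerns of the direct pointer cast. A sketch with illustrative names (not the article's code):

```cpp
#include <cstddef>
#include <cstring>

struct Vec3 { float x, y, z; };

// Read a 3D vector from vertex nVertex at byte offset nOffset, where the
// offset comes from the vertex declaration and nVertexSize from
// D3DXGetDeclVertexSize() (here plain parameters).
Vec3 ReadVec3( const void* pData, std::size_t nVertexSize,
               std::size_t nVertex, std::size_t nOffset )
{
    Vec3 v;
    std::memcpy( &v,
        (const unsigned char*) pData + nVertexSize * nVertex + nOffset,
        sizeof v );
    return v;
}

// Write a 3D vector into the same location of the vertex buffer.
void WriteVec3( void* pData, std::size_t nVertexSize,
                std::size_t nVertex, std::size_t nOffset, Vec3 v )
{
    std::memcpy( (unsigned char*) pData + nVertexSize * nVertex + nOffset,
        &v, sizeof v );
}
```

The compiler turns these memcpy calls into plain loads and stores, so the safety costs nothing over the cast in practice.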