/*
** $Header: /cvs/projects/ogl-sample/main/gfx/lib/glu/libtess/alg-outline,v 1.1 2000/04/26 05:53:59 ljp Exp $
*/

This is only a very brief overview.  There is quite a bit of
additional documentation in the source code itself.


Goals of robust tessellation
----------------------------

The tessellation algorithm is fundamentally a 2D algorithm.  We
initially project all data into a plane; our goal is to robustly
tessellate the projected data.  The same topological tessellation is
then applied to the input data.

Topologically, the output should always be a tessellation.  If the
input is even slightly non-planar, then some triangles will
necessarily be back-facing when viewed from some angles, but the goal
is to minimize this effect.

The algorithm needs some capability of cleaning up the input data as
well as the numerical errors in its own calculations.  One way to do
this is to specify a tolerance as defined above, and clean up the
input and output during the line sweep process.  At the very least,
the algorithm must handle coincident vertices, vertices incident to an
edge, and coincident edges.


Phases of the algorithm
-----------------------

1. Find the polygon normal N.
2. Project the vertex data onto a plane.  It does not need to be
   perpendicular to the normal, eg. we can project onto the plane
   perpendicular to the coordinate axis whose dot product with N
   is largest.
3. Using a line-sweep algorithm, partition the plane into x-monotone
   regions.  Any vertical line intersects an x-monotone region in
   at most one interval.
4. Triangulate the x-monotone regions.
5. Group the triangles into strips and fans.


Finding the normal vector
-------------------------

A common way to find a polygon normal is to compute the signed area
when the polygon is projected along the three coordinate axes.
We
can't do this, since contours can have zero area without being
degenerate (eg. a bowtie).

We fit a plane to the vertex data, ignoring how they are connected
into contours.  Ideally this would be a least-squares fit; however for
our purpose the accuracy of the normal is not important.  Instead we
find three vertices which are widely separated, and compute the normal
to the triangle they form.  The vertices are chosen so that the
triangle has an area at least 1/sqrt(3) times the largest area of any
triangle formed using the input vertices.

The contours do affect the orientation of the normal; after computing
the normal, we check that the sum of the signed contour areas is
non-negative, and reverse the normal if necessary.


Projecting the vertices
-----------------------

We project the vertices onto a plane perpendicular to one of the three
coordinate axes.  This helps numerical accuracy by removing a
transformation step between the original input data and the data
processed by the algorithm.  The projection also compresses the input
data: the 2D distance between vertices after projection may be smaller
than the original 3D distance.  However by choosing the coordinate
axis whose dot product with the normal is greatest, the projected
distance is at least 1/sqrt(3) times the original, ie. the compression
factor is at most sqrt(3).

Even though the *accuracy* of the normal is not that important (since
we are projecting perpendicular to a coordinate axis anyway), the
*robustness* of the computation is important.  For example, if there
are many vertices which lie almost along a line, and one vertex V
which is well-separated from the line, then our normal computation
should involve V; otherwise the results will be garbage.

The advantage of projecting perpendicular to the polygon normal is
that computed intersection points will be as close as possible to
their ideal locations.  To get this behavior, define TRUE_PROJECT.
The Line Sweep
--------------

There are three data structures: the mesh, the event queue, and the
edge dictionary.

The mesh is a "quad-edge" data structure which records the topology of
the current decomposition; for details see the include file "mesh.h".

The event queue simply holds all vertices (both original and computed
ones), organized so that we can quickly extract the vertex with the
minimum x-coord (and among those, the one with the minimum y-coord).

The edge dictionary describes the current intersection of the sweep
line with the regions of the polygon.  This is just an ordering of the
edges which intersect the sweep line, sorted by their current order of
intersection.  For each pair of edges, we store some information about
the monotone region between them -- these are called "active regions"
(since they are crossed by the current sweep line).

The basic algorithm is to sweep from left to right, processing each
vertex.  The processed portion of the mesh (left of the sweep line) is
a planar decomposition.  As we cross each vertex, we update the mesh
and the edge dictionary, then we check any newly adjacent pairs of
edges to see if they intersect.

A vertex can have any number of edges.  Vertices with many edges can
be created as vertices are merged and intersection points are
computed.  For unprocessed vertices (right of the sweep line), these
edges are in no particular order around the vertex; for processed
vertices, the topological ordering should match the geometric ordering.

The vertex processing happens in two phases: first we process all the
left-going edges (all these edges are currently in the edge
dictionary).
This involves:

- deleting the left-going edges from the dictionary;
- relinking the mesh if necessary, so that the order of these edges around
  the event vertex matches the order in the dictionary;
- marking any terminated regions (regions which lie between two left-going
  edges) as either "inside" or "outside" according to their winding number.

When there are no left-going edges, and the event vertex is in an
"interior" region, we need to add an edge (to split the region into
monotone pieces).  To do this we simply join the event vertex to the
rightmost left endpoint of the upper or lower edge of the containing
region.

Then we process the right-going edges.  This involves:

- inserting the edges in the edge dictionary;
- computing the winding number of any newly created active regions.
  We can compute this incrementally using the winding of each edge
  that we cross as we walk through the dictionary.
- relinking the mesh if necessary, so that the order of these edges around
  the event vertex matches the order in the dictionary;
- checking any newly adjacent edges for intersection and/or merging.

If there are no right-going edges, again we need to add one to split
the containing region into monotone pieces.  In our case it is most
convenient to add an edge to the leftmost right endpoint of either
containing edge; however we may need to change this later (see the
code for details).


Invariants
----------

These are the most important invariants maintained during the sweep.
We define a function VertLeq(v1,v2) which defines the order in which
vertices cross the sweep line, and a function EdgeLeq(e1,e2; loc)
which says whether e1 is below e2 at the sweep event location "loc".
This function is defined only at sweep event locations which lie
between the rightmost left endpoint of {e1,e2}, and the leftmost right
endpoint of {e1,e2}.

Invariants for the Edge Dictionary.

- Each pair of adjacent edges e2=Succ(e1) satisfies EdgeLeq(e1,e2)
  at any valid location of the sweep event.
- If EdgeLeq(e2,e1) as well (at any valid sweep event), then e1 and e2
  share a common endpoint.
- For each e in the dictionary, e->Dst has been processed but not e->Org.
- Each edge e satisfies VertLeq(e->Dst,event) && VertLeq(event,e->Org)
  where "event" is the current sweep line event.
- No edge e has zero length.
- No two edges have identical left and right endpoints.

Invariants for the Mesh (the processed portion).

- The portion of the mesh left of the sweep line is a planar graph,
  ie. there is *some* way to embed it in the plane.
- No processed edge has zero length.
- No two processed vertices have identical coordinates.
- Each "inside" region is monotone, ie. can be broken into two chains
  of monotonically increasing vertices according to VertLeq(v1,v2)
  - a non-invariant: these chains may intersect (slightly) due to
    numerical errors, but this does not affect the algorithm's operation.

Invariants for the Sweep.

- If a vertex has any left-going edges, then these must be in the edge
  dictionary at the time the vertex is processed.
- If an edge is marked "fixUpperEdge" (it is a temporary edge introduced
  by ConnectRightVertex), then it is the only right-going edge from
  its associated vertex.  (This says that these edges exist only
  when it is necessary.)


Robustness
----------

The key to the robustness of the algorithm is maintaining the
invariants above, especially the correct ordering of the edge
dictionary.  We achieve this by:
1. Writing the numerical computations for maximum precision rather
   than maximum speed.

2. Making no assumptions at all about the results of the edge
   intersection calculations -- for sufficiently degenerate inputs,
   the computed location is not much better than a random number.

3. When numerical errors violate the invariants, restoring them
   by making *topological* changes when necessary (ie. relinking
   the mesh structure).


Triangulation and Grouping
--------------------------

We finish the line sweep before doing any triangulation.  This is
because even after a monotone region is complete, there can be further
changes to its vertex data because of further vertex merging.

After triangulating all monotone regions, we want to group the
triangles into fans and strips.  We do this using a greedy approach.
The triangulation itself is not optimized to reduce the number of
primitives; we just try to get a reasonable decomposition of the
computed triangulation.