Introduction

It is well recognized that numerical methods based on a spatial discretization (such as the finite element method, the boundary element method, and the finite volume method) play an important role in systems for computer aided design (CAD), engineering (CAE), and manufacturing (CAM). In particular, the finite element method (FEM) is probably the most widespread analysis technique in the engineering community. It is capable of solving field problems governed by partial differential equations on very complex geometries. However, successful and efficient use of the FEM still requires significant expertise, time, and cost. Many researchers are therefore investigating ways to automate the FEM, allowing improved productivity, more accurate solutions, and use by less trained personnel. The most time-consuming task faced by an analyst, and the one requiring the most experience, is often the discretization of the general geometric definition of a problem into a valid and well-conditioned finite element mesh. The accuracy and cost of the analysis depend directly on the size, shape, and number of elements in the mesh. Automatic generation of consistent, reproducible, high-quality meshes without user intervention makes the power of finite element analysis accessible to those who are not experts in mesh generation. Therefore, tools for automated and efficient mesh generation are an important prerequisite for the complete integration of the FEM into the design processes of CAD, CAE, and CAM systems.

Most of the research on the development of fully automatic unstructured mesh generators has concentrated on various triangulation schemes. Their advantage lies in the fact that simplicial elements (triangles and tetrahedra) are the most suitable for discretizing domains of arbitrary complexity, particularly when locally graded meshes are needed. Over the past decades, a wide class of algorithms for the generation of triangular and tetrahedral meshes has been established, from which three basic strategies -- the tree-based approach, the advancing front technique, and Delaunay triangulation -- have proved particularly successful. While the meshing schemes for the discretization of 2D problems have matured into very robust and efficient algorithms, there are still many open issues in 3D, including not only theoretical guarantees of convergence and quality bounds but also implementation aspects such as robustness and versatility. Moreover, the successful development of a single optimal, fast, and robust algorithm suitable for all possible applications is improbable.

Octree-based algorithms [1,2,3,4,5,6] have proved to be quite robust and efficient; however, they suffer from a rather complicated implementation, poor-quality elements along the boundary, the inability to generate anisotropic meshes, and limited flexibility of mesh density control. The advancing front technique [7,8,9,10,11,12,13] is a heuristic method appealing especially for its point placement strategy and the high quality of elements along the boundary. However, its relatively high computational demands, lack of robustness, and the absence of a proof of termination for 3D geometries are considered significant deficiencies. The algorithms based on the Delaunay empty-circumsphere property [14,15,16,17,18,19,20,21] are attractive for their solid mathematical background and their simple and efficient implementation. However, the Delaunay triangulation is generally not boundary conforming and, in contrast to planar geometries, the existence of the constrained Delaunay triangulation in 3D is generally not guaranteed. Moreover, the boundary recovery process in 3D is rather complicated and computationally demanding.

There are generally two approaches available to recover the missing boundary constraints. The first one [22,23,24,25,26,27,28,29,30,31] is based on a posteriori recovery using additional (Steiner) points inserted on the boundary so that the boundary appears in the resulting triangulation in a slightly different (refined) form which is geometrically and topologically similar to the original boundary. The resulting mesh is therefore boundary conforming only in a weak sense. Only a few attempts are reported in the literature [23,25] to make the final triangulation conform to the original constraints. These are usually based on the retriangulation of the polyhedron formed by all elements connected to each of the additionally inserted points. Alternatively, the missing constraints are recovered by a flipping post-processing procedure [32,33]; however, cases in which the flipping transformation cannot be accomplished are ultimately resolved by the insertion of Steiner points. The second approach [34,35,36] tries to redefine the constraints a priori, again by inserting additional points, yielding new constraints which are geometrically and topologically similar to the original ones and which appear in the standard Delaunay triangulation. A significant drawback of the a priori approach (including the methodology proposed in this paper) is the fact that the appearance of all constraining faces has to be verified, typically using floating-point operations, whereas in the a posteriori approach the presence of individual constraining faces is checked using topological information and only the missing faces are recovered. Various strategies for additional point insertion appear in the literature [30,31,35], depending on whether the points are inserted only on constraining edges or also on constraining faces. The termination of the insertion of additional points is usually ensured using the concept of a protective sphere, which is enforced either explicitly [30,35] or implicitly [31].
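The protective-sphere idea behind such a priori schemes can be illustrated with a short sketch (hypothetical helper names, plain floating-point arithmetic; not the implementation of any of the cited papers). It relies on a standard sufficient condition: an edge whose smallest circumscribing (diametral) sphere contains no other point -- a Gabriel edge -- is guaranteed to appear in the Delaunay triangulation, so a constraining edge can be bisected until every sub-segment satisfies this condition:

```python
import numpy as np

def diametral_sphere_empty(a, b, points, eps=1e-12):
    """True if the smallest sphere circumscribing segment ab (its
    diametral sphere) contains none of the given points strictly
    inside.  An edge with this property (a Gabriel edge) is
    guaranteed to appear in the Delaunay triangulation."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    center = 0.5 * (a + b)
    radius2 = 0.25 * np.dot(b - a, b - a)
    return all(np.dot(np.asarray(p, float) - center,
                      np.asarray(p, float) - center) >= radius2 - eps
               for p in points)

def split_until_protected(a, b, points):
    """Bisect segment ab recursively until every sub-segment has an
    empty diametral sphere; returns the inserted (Steiner) points.
    The sketch assumes no input point lies on the segment itself,
    otherwise the recursion would not terminate."""
    if diametral_sphere_empty(a, b, points):
        return []
    m = 0.5 * (np.asarray(a, float) + np.asarray(b, float))
    return (split_until_protected(a, m, points) + [m]
            + split_until_protected(m, b, points))
```

For example, the edge from (0, 0, 0) to (4, 0, 0) with a nearby point (2, 1, 0) is split once at its midpoint, after which both halves are protected. Practical schemes add safeguards (minimum edge length, point-merging tolerances) that this sketch omits.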

In the present paper, a different approach is adopted to generate a boundary conforming mesh. The key idea is to build the constrained Delaunay triangulation from the beginning using a modified Watson's point insertion algorithm. The modification concerns the creation of the cavity: existing tetrahedra attached to a constraining face on the side opposite to the point being inserted are prevented from being removed. This implies that the in-sphere predicate, used to verify the empty-circumsphere property, has to be accompanied by an in-plane predicate to reliably ensure that all sides of the cavity are visible from the point being inserted. However, this only ensures that the newly inserted point does not violate the boundary conformity of the constraining faces already present in the intermediate constrained Delaunay triangulation. The actual appearance of an inserted constraining face in the final triangulation is achieved by a proper ordering of point insertion. This ordering is driven by a dependency graph built according to the violation of the empty-circumsphere property of each constraining face. However, the dependency graph can be successfully used for point insertion only if there are no cyclic dependencies; therefore, all cyclic dependencies are eliminated from the violation dependency graph before the point insertion is started. There are several ways to get rid of undesirable violations, namely to perturb the point location, to classify the violation as safe, and, as a last resort, to create a new tetrahedron (using the advancing front technique) attached to the constraining face. After all cyclic dependencies have been removed, the boundary points are inserted using the modified Watson's algorithm, following the hierarchy of the dependency graph. In this way, the constrained Delaunay triangulation respecting all constraining faces (including the new ones created together with the new tetrahedra) is constructed.
This triangulation is then enriched by the insertion of internal points, while keeping all the relevant boundary constraints.
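The two predicates mentioned above can be expressed as signs of small determinants. The following minimal sketch shows a generic textbook formulation (not the paper's actual implementation) of the in-plane (orientation) test, which decides on which side of a constraining face a point lies, and the in-sphere test used to build the cavity. It uses plain floating-point arithmetic; a production mesher would evaluate these determinants with robust (exact or adaptive-precision) arithmetic.

```python
import numpy as np

def orient3d(a, b, c, d):
    """Signed-volume (in-plane) predicate: positive if d lies on the
    side of plane abc from which the vertices a, b, c appear in
    counterclockwise order; zero if the four points are coplanar."""
    m = np.array([np.subtract(b, a),
                  np.subtract(c, a),
                  np.subtract(d, a)], dtype=float)
    return np.linalg.det(m)

def in_sphere(a, b, c, d, e):
    """In-sphere predicate: True if e lies strictly inside the
    circumsphere of tetrahedron abcd.  The sign of the 4x4 lifted
    determinant is normalized by the tetrahedron's orientation, so
    the vertex order of abcd does not matter."""
    rows = []
    for p in (a, b, c, d):
        v = np.subtract(p, e).astype(float)
        rows.append([v[0], v[1], v[2], np.dot(v, v)])
    return np.linalg.det(np.array(rows)) * orient3d(a, b, c, d) < 0.0
```

For the unit tetrahedron (0,0,0), (1,0,0), (0,1,0), (0,0,1), the circumcentre (0.5, 0.5, 0.5) tests as inside and a distant point such as (2, 2, 2) as outside, regardless of the vertex ordering.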

The outline of the paper is as follows. In Section 2, the definition of the constrained Delaunay triangulation is recalled. The a priori boundary recovery process used to ensure the appearance of constraining faces in the resulting triangulation is described in Section 3. The actual domain discretization concept is briefly sketched in Section 4. In Section 5, a discussion of various aspects of the proposed approach, together with future work directions, can be found. The paper ends with concluding remarks in Section 6.

*Daniel Rypl
2005-11-06*