3D Landmarking in Multiexpression Face Analysis: A Preliminary Study on Eyebrows and Mouth

  • Innovative Techniques
  • Experimental/Special Topics
  • Published in: Aesthetic Plastic Surgery

Abstract

The application of three-dimensional (3D) facial analysis and landmarking algorithms in maxillofacial surgery and other medical fields, such as the diagnosis of diseases from facial anomalies and dysmorphism, has gained considerable attention. In previous work, we used a geometric approach to automatically extract 3D facial key points, called landmarks, working in the differential-geometry domain through the coefficients of the fundamental forms, the principal, mean, and Gaussian curvatures, derivatives, the shape and curvedness indexes, and the tangent map. In this article we describe the extension of our previous landmarking algorithm, which is now able to extract eyebrow and mouth landmarks on both old and new meshes. The algorithm has been tested on our face database and on the public Bosphorus 3D database. We chose to study the mouth and eyebrows separately because of the role these parts play in facial expressions. Indeed, since the mouth is the part of the face that moves the most and most strongly shapes facial expressions, extracting mouth landmarks from various facial poses demonstrates that the newly developed algorithm is pose-independent.

No Level Assigned

This journal requires that authors assign a level of evidence to each submission to which Evidence-Based Medicine rankings are applicable. This excludes Review Articles, Book Reviews, and manuscripts that concern Basic Science, Animal Studies, Cadaver Studies, and Experimental Studies. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors at http://www.springer.com/00266.

References

  1. Alker M, Frantz S, Rohr K, Stiehl HS (2001) Improving the robustness in extracting 3D point landmarks from 3D medical images using parametric deformable models. In: Niessen WJ, Viergever MA (eds) Medical image computing and computer-assisted intervention—MICCAI 2001, 4th international conference, Utrecht, The Netherlands, October 14–17, 2001. Proceedings series: lecture notes in computer science, vol 2208. Springer, Berlin, pp 582–590

  2. Alyüz N, Gökberk B, Dibeklioğlu H, Savran A, Salah AA, Akarun L, Sankur B (2008) 3D face recognition benchmarks on the Bosphorus database with focus on facial expressions. In: Schouten B, Juul NC, Drygajlo A, Tistarelli M (eds) Biometrics and identity management, first European workshop, BIOID 2008, Roskilde, Denmark, May 7–9, 2008, lecture notes in computer science/image processing, computer vision, pattern recognition, and graphics (book 5372). Springer, Berlin, pp 57–66

  3. Calignano F, Vezzetti E (2010) Soft tissue diagnosis in maxillofacial surgery: a preliminary study on three-dimensional face geometrical features-based analysis. Aesthet Plast Surg 34(2):200–211

  4. Creusot C, Pears N, Austin J (2013) A machine-learning approach to keypoint detection and landmarking on 3D meshes. Int J Comput Vis 102(1–3):146–179

  5. D’Hose J, Colineau J, Bichon C, Dorizzi B (2007) Precise localization of landmarks on 3D faces using Gabor wavelets. In: First IEEE international conference on biometrics: theory, applications, and systems. IEEE, New York, pp 1–6

  6. Dibeklioğlu H, Salah AA, Akarun L (2008) 3D facial landmarking under expression, pose, and occlusion variations. In: Second IEEE international conference on biometrics: theory, applications and systems. IEEE, New York, pp 1–6

  7. Ekman P (1970) Universal facial expressions of emotions. Calif Mental Health Res Dig 8(4):151–158

  8. Ekman P, Keltner D (1997) Facial expressions of emotions. Lawrence Erlbaum Associates, Mahwah, NJ

  9. Frantz S, Rohr K, Stiehl HS (1998) Multi-step procedures for the localization of 2D and 3D point landmarks and automatic ROI size selection. In: Burkhardt H, Neumann B (eds) Computer vision—ECCV’98, European conference on computer vision, Freiburg, Germany, June 1998, Proceedings, vol I, lecture notes in computer science, vol 1406. Springer, Berlin, pp 687–703

  10. Frantz S, Rohr K, Stiehl HS (1999) Improving the detection performance in semi-automatic landmark extraction. In: Taylor C, Colchester A (eds) Medical image computing and computer-assisted intervention—MICCAI ’99, second international conference, Cambridge, UK, September 19–22, 1999, Proceedings series: lecture notes in computer science vol 1679. Springer, Berlin, pp 253–262

  11. Frantz S, Rohr K, Stiehl HS (2000) Localization of 3D anatomical point landmarks in 3D tomographic images using deformable models. In: Delp SL, DiGoia AM, Jaramaz B (eds) Medical image computing and computer-assisted intervention—MICCAI 2000, third international conference Pittsburgh, PA, USA, October 11–14, 2000, Proceedings series: lecture notes in computer science vol 1935. Springer, Berlin, pp 492–501

  12. Frantz S, Rohr K, Stiehl HS (2005) Development and validation of a multi-step approach to improved detection of 3D point landmarks in tomographic images. Image Vision Comput 23(11):956–971

  13. Heckbert PS (1986) Survey of texture mapping. Comput Graphics Appl 6(11):56–67

  14. Koenderink JJ, van Doorn AJ (1992) Surface shape and curvature scales. Image Vision Comput 10(8):557–564

  15. Perakis P, Palassis G, Theoharis T, Kakadiaris LA (2009) Automatic 3D facial region retrieval from multi-pose facial datasets. Eurographics 2009 workshop on 3D object retrieval, Munich, Germany, 29 March 2009, 30th annual conference of the European association for computer graphics

  16. Romero M, Pears N (2009) Landmark localization in 3D face data. In: 6th IEEE international conference on advanced video and signal based surveillance, Genova, Italy, 2–4 September 2009. IEEE, New York, pp 73–78

  17. Romero M, Pears N (2009) Point-pair descriptors for 3D facial landmark localization. IEEE 3rd international conference on biometrics: theory, applications, and systems. IEEE, New York, pp 1–6

  18. Ruiz MC, Illingworth J (2008) Automatic landmarking of faces in 3D - ALF3D. In: 5th international conference on visual information engineering, Xi’an, China, July 29–August 1, 2008, IET Conference Publication 543. Curran Associates, Red Hook, NY, pp 41–46

  19. Salah AA, Akarun L (2006) 3D facial feature localization for registration. In: Gunsel B, Jain AK, Tekalp AM, Sankur B (eds) Multimedia content representation, classification and security, international workshop, MRCS 2006, Istanbul, Turkey, September 11–13, 2006, Proceedings series: lecture notes in computer science, vol 4105. Springer, Berlin, pp 338–345

  20. Salah AA, Akarun L (2006) Gabor factor analysis for 2D + 3D facial landmark localization. In: IEEE 14th signal processing and communications applications. IEEE, New York, pp 1–4

  21. Salah AA, Çinar H, Akarun L, Sankur B (2007) Robust facial landmarking for registration. Ann Telecommun 62(1–2):83–108

  22. Sang-Jun P, Dong-Won S (2008) 3D face recognition based on feature detection using active shape models. International conference on control, automation and systems, Seoul, Korea, 14–17 October 2008, pp 1881–1886

  23. Vezzetti E, Marcolin F, Stola V (2013) 3D human face soft tissues landmarking method: an advanced approach. Comput Ind 64(9):1326–1354

  24. Wörz S, Rohr K (2005) Localization of anatomical point landmarks in 3D medical images by fitting 3D parametric intensity models. Med Image Anal 10(1):41–58

  25. Zhang X, Wang Y, Pan G (2013) 3D facial landmark localization via a local surface descriptor HoSNI. In: Intelligent science and intelligent data engineering. Springer, Berlin, pp 313–321

Conflicts of interest

The authors have no conflicts of interest to disclose.

Author information

Corresponding author

Correspondence to Federica Marcolin.

Appendix

The first and second fundamental forms, which are used to measure lengths and bending on a surface, are defined by

$$\begin{gathered} E{\text{d}}u^{2} + 2F{\text{d}}u{\text{d}}v + G{\text{d}}v^{2} , \hfill \\ e{\text{d}}u^{2} + 2f{\text{d}}u{\text{d}}v + g{\text{d}}v^{2} , \hfill \\ \end{gathered}$$

respectively, where E, F, G, e, f, and g are their coefficients. Curvatures measure how a regular surface x bends in R^3. If N is the Gauss map of the surface (the field of unit normal vectors) and DN is its differential, then the determinant of DN is the product of the principal curvatures, (−k1)(−k2) = k1 k2, and the trace of DN is the negative of their sum, −(k1 + k2). At a point P, the determinant of DN_P is the Gaussian curvature K of x at P, and the negative of half the trace of DN_P is the mean curvature H of x at P. In terms of the principal curvatures, K and H can be written as

$$\begin{gathered} K = k_{1} k_{2} , \hfill \\ H = \frac{{k_{1} + k_{2} }}{2}. \hfill \\ \end{gathered}$$
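The same quantities can also be obtained directly from the fundamental form coefficients, since K = (eg − f²)/(EG − F²) and H = (eG − 2fF + gE)/(2(EG − F²)). As an illustrative sketch only (the function name and the numpy-based vectorization are assumptions, not the paper's implementation), this could be coded as:

```python
import numpy as np

def curvatures_from_fundamental_forms(E, F, G, e, f, g):
    """Gaussian curvature K, mean curvature H, and principal curvatures
    k1 >= k2 from the coefficients of the first (E, F, G) and second
    (e, f, g) fundamental forms; inputs may be scalars or numpy arrays."""
    denom = E * G - F ** 2                       # determinant of the first fundamental form
    K = (e * g - f ** 2) / denom                 # determinant of the shape operator
    H = (e * G - 2.0 * f * F + g * E) / (2.0 * denom)  # half of its trace
    disc = np.sqrt(np.maximum(H ** 2 - K, 0.0))  # clamp tiny negatives caused by noise
    return K, H, H + disc, H - disc              # K, H, k1, k2
```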

The definitions of these descriptors, in the forms implemented in the algorithm, are:

$$\begin{gathered} E = 1 + h_{x}^{2} , \hfill \\ F = h_{x} h_{y} , \hfill \\ G = 1 + h_{y}^{2} , \hfill \\ e = \frac{{h_{xx} }}{{\sqrt {1 + h_{x}^{2} + h_{y}^{2} } }}, \hfill \\ f = \frac{{h_{xy} }}{{\sqrt {1 + h_{x}^{2} + h_{y}^{2} } }}, \hfill \\ g = \frac{{h_{yy} }}{{\sqrt {1 + h_{x}^{2} + h_{y}^{2} } }}, \hfill \\ K = \frac{{h_{xx} h_{yy} - h_{xy}^{2} }}{{\left( {1 + h_{x}^{2} + h_{y}^{2} } \right)^{2} }}, \hfill \\ H = \frac{{\left( {1 + h_{x}^{2} } \right)h_{yy} - 2h_{x} h_{y} h_{xy} + \left( {1 + h_{y}^{2} } \right)h_{xx} }}{{2\left( {1 + h_{x}^{2} + h_{y}^{2} } \right)^{3/2} }}, \hfill \\ k_{1} = H + \sqrt {H^{2} - K} , \hfill \\ k_{2} = H - \sqrt {H^{2} - K} , \hfill \\ \end{gathered}$$

where h is a differentiable function and the surface is its graph z = h(x, y). It is therefore convenient to have at hand formulas for the relevant concepts in this case. To obtain such formulas, the surface is parameterized by

$$x\left( {u,v} \right) = \left( {u,v,h\left( {u,v} \right)} \right),\quad\left( {u,v} \right) \in U,$$

where u = x and v = y.
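As a purely illustrative sketch, the descriptors above could be evaluated on a regularly sampled range image z = h(x, y) using finite differences; the function name, the use of numpy.gradient, and the uniform grid spacing are assumptions for illustration, not details taken from the paper:

```python
import numpy as np

def monge_patch_descriptors(h, spacing=1.0):
    """Differential-geometry descriptors of a range image z = h(x, y)
    sampled on a regular grid, following the Monge-patch formulas above.
    `spacing` is the (assumed) grid step along both axes."""
    hy, hx = np.gradient(h, spacing)             # first derivatives (rows = y, cols = x)
    hxy, hxx = np.gradient(hx, spacing)          # second derivatives of h_x
    hyy, _ = np.gradient(hy, spacing)            # second derivative of h_y
    w = np.sqrt(1.0 + hx ** 2 + hy ** 2)

    E, F, G = 1.0 + hx ** 2, hx * hy, 1.0 + hy ** 2
    e, f, g = hxx / w, hxy / w, hyy / w

    K = (hxx * hyy - hxy ** 2) / w ** 4
    H = ((1.0 + hx ** 2) * hyy - 2.0 * hx * hy * hxy
         + (1.0 + hy ** 2) * hxx) / (2.0 * w ** 3)
    disc = np.sqrt(np.maximum(H ** 2 - K, 0.0))
    k1, k2 = H + disc, H - disc
    return dict(E=E, F=F, G=G, e=e, f=f, g=g, K=K, H=H, k1=k1, k2=k2)
```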

The most widely used descriptors are the shape index S and the curvedness index C, introduced by Koenderink and van Doorn [14]:

$$\begin{gathered} S = - \frac{2}{\pi }\arctan \frac{{k_{1} + k_{2} }}{{k_{1} - k_{2} }},\,S \in \left[ { - 1,1} \right],\,k_{1} \ge k_{2} , \hfill \\ C = \sqrt {\frac{{k_{1}^{2} + k_{2}^{2} }}{2}} . \hfill \\ \end{gathered}$$

Given the role they play in this work, a brief digression on their significance is warranted. Their meaning is illustrated in Figs. 19, 20, and 21 and in Table 1.
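For illustration only, a minimal sketch of how S and C could be computed from the principal curvatures and how S could be binned into seven classes; the equal-width subintervals, the eps guard at umbilical points, and the helper names are assumptions, while the actual subinterval boundaries and class names are those of Fig. 19 and Table 1:

```python
import numpy as np

def shape_and_curvedness(k1, k2, eps=1e-12):
    """Shape index S in [-1, 1] and curvedness C >= 0 from the principal
    curvatures k1 >= k2, following the formulas above (Koenderink and
    van Doorn [14]). `eps` guards the umbilical case k1 == k2, where S
    is undefined."""
    S = -(2.0 / np.pi) * np.arctan((k1 + k2) / (k1 - k2 + eps))
    C = np.sqrt((k1 ** 2 + k2 ** 2) / 2.0)
    return S, C

def shape_class(S, n_classes=7):
    """Index (0..6) of the topographic class of S, obtained by splitting
    the range [-1, 1] into `n_classes` equal subintervals. The mapping of
    indexes to the class names of Table 1 is assumed, not taken from it."""
    edges = np.linspace(-1.0, 1.0, n_classes + 1)
    return np.clip(np.digitize(S, edges[1:-1]), 0, n_classes - 1)
```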

Fig. 19 Illustration of the Shape Index scale divided into seven categories: different subintervals of its range [−1, 1] correspond to seven geometric surface types

Fig. 20 Curvedness Index scale; range is [0, ∞)

Fig. 21 Indexes (S, C) viewed as polar coordinates in the (k1, k2) plane, with planar points mapped to the origin. The figure shows how variations in the Curvedness Index (radial coordinate) and Shape Index (angular coordinate) affect the surface structure, and how these components relate to the principal curvatures k1 and k2. The degree of curvature increases radially from the center

Table 1 Topographic classes


Cite this article

Vezzetti, E., Marcolin, F. 3D Landmarking in Multiexpression Face Analysis: A Preliminary Study on Eyebrows and Mouth. Aesth Plast Surg 38, 796–811 (2014). https://doi.org/10.1007/s00266-014-0334-2
