Abstract
Deep learning based facial landmark detection has recently achieved great success. Nevertheless, we observe that semantic ambiguity substantially degrades detection performance. Specifically, semantic ambiguity means that some landmarks (e.g. those evenly distributed along the face contour) lack clear and accurate definitions, which causes annotators to label them inconsistently. These inconsistent annotations, usually provided by public databases, commonly serve as the ground truth for supervising network training, leading to degraded accuracy. To our knowledge, little research has investigated this problem. In this paper, we propose a novel probabilistic model that introduces a latent variable, i.e. the 'real' ground truth, which is semantically consistent, to optimize. This framework couples two parts: (1) training the landmark detection CNN and (2) searching for the 'real' ground truth. The two parts are optimized alternately: the searched 'real' ground truth supervises the CNN training, and the trained CNN assists the search for the 'real' ground truth. In addition, to recover landmarks predicted with low confidence due to occlusion and low image quality, we propose a global heatmap correction unit (GHCU) that corrects outliers by using the global face shape as a constraint. Extensive experiments on both image-based (300W and AFLW) and video-based (300-VW) databases demonstrate that our method effectively improves landmark detection accuracy and achieves state-of-the-art performance.
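The alternating scheme described above can be sketched in a few lines. This is a minimal illustration, not the paper's actual probabilistic objective: the `alpha` blending weight, the `predict`/`train_step` callables, and the linear blend used to "search" the latent ground truth are all hypothetical simplifications assumed here for clarity.

```python
import numpy as np

def alternate_optimize(annotations, predict, train_step, num_iters=5, alpha=0.5):
    """Alternate between (1) supervising the detector with the current
    latent 'real' ground truth and (2) re-estimating that latent ground
    truth with the help of the trained detector.

    `alpha` (hypothetical) blends the original noisy annotations with
    the detector's predictions when updating the latent ground truth.
    """
    latent_gt = annotations.copy()  # initialize latent GT with the given labels
    for _ in range(num_iters):
        train_step(latent_gt)       # (1) train the CNN toward the latent GT
        preds = predict()           # (2) predict landmarks with the trained CNN
        # "Search" the latent GT: pull it toward the (more semantically
        # consistent) predictions while staying close to the annotations.
        latent_gt = alpha * annotations + (1.0 - alpha) * preds
    return latent_gt
```

In the actual method the latent ground truth is found by optimizing a probabilistic model rather than by a fixed linear blend; the sketch only conveys the alternation between supervising the network and refining its supervision signal.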
URL
https://arxiv.org/abs/1903.10661