PhD defense of the student Pedro Henrique Lopes, on 17/06/2022 at 14:30.

Title: Rethink and Redesign Deep Learning for Biometrics
Date: 17/06/2022    Time: 14:30
Streaming link: https://meet.google.com/pwq-kvjn-ymn
Examination committee: Prof. Dr. Luciano Rebouças (UFBA); Prof. Dr. Rafael Queiroz (UFOP); Prof. Dr. Rodrigo Pedrosa (UFOP); Prof. Dr. Thiago Santos (UFES); Prof. Dr. Gladston Moreira (UFOP); Prof. Dr. Eduardo Luz (UFOP)
Abstract: Biometric systems are a common part of everyday life, and efforts to make them more secure and robust increase every year. However, systems based on a single biometric modality perform far from perfectly in non-cooperative environments compared to cooperative ones, which makes the approaches proposed for those scenarios more complex. For this reason, new studies aim to improve the performance of biometric-based systems by devising new ways of training machine-learning algorithms and new representations. Currently, many researchers are directing their efforts toward new metric-learning approaches for deep learning architectures across a wide range of problems, including biometrics. In this work, we propose a biometric-oriented loss function, called D-loss, to create deep representations for use in biometric systems. A different strategy to improve system robustness is to fuse two or more biometric modalities. However, no single dataset covers all possible biometric combinations. A simple solution is to create a synthetic one, although the methodology for doing so is still an open problem in the literature. In this work, a criterion is proposed to merge two or more modalities while preserving reproducibility: the Doddington Zoo criteria. Several merging strategies are evaluated: fusion at the score level (minimum, multiplication, and sum) and at the feature level (simple concatenation and metric learning). The results show the effectiveness of the proposed loss, with the smallest EER of 5.38%, 13.01%, and 7.96% for MNIST-Fashion, CIFAR-10, and CASIA-V4, respectively. An EER close to zero is also observed when the proposed merging criteria are used with the ECG, eye, and face modalities. Two datasets with over 1,000 individuals are used to evaluate the merging criteria along with the D-loss and other metric-learning losses. The results show that the Doddington Zoo criteria produce similar (reproducible) datasets and demonstrate the robustness of the D-loss.
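For readers unfamiliar with the evaluation setup mentioned in the abstract, the sketch below illustrates score-level fusion with the minimum, multiplication, and sum rules and an equal error rate (EER) estimate. It is a minimal illustration under assumed conventions (higher score means better match; toy Gaussian scores), not the thesis implementation of D-loss or the Doddington Zoo criteria; all names and data are hypothetical.

```python
# Minimal sketch (not the thesis implementation): score-level fusion rules
# (minimum, multiplication, sum) and an EER estimate from genuine/impostor
# match scores. Variable names and data are illustrative only.
import numpy as np


def fuse_scores(scores_a, scores_b, rule="sum"):
    """Combine match scores from two modalities with a simple fixed rule."""
    a = np.asarray(scores_a, dtype=float)
    b = np.asarray(scores_b, dtype=float)
    if rule == "min":
        return np.minimum(a, b)
    if rule == "mul":
        return a * b
    if rule == "sum":
        return a + b
    raise ValueError(f"unknown rule: {rule}")


def equal_error_rate(genuine, impostor):
    """EER: point where false-acceptance and false-rejection rates meet.

    Assumes higher scores indicate a better match.
    """
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    far = np.array([(impostor >= t).mean() for t in thresholds])  # false accepts
    frr = np.array([(genuine < t).mean() for t in thresholds])    # false rejects
    idx = np.argmin(np.abs(far - frr))
    return (far[idx] + frr[idx]) / 2.0


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy scores for two modalities (e.g. face and ECG), genuine vs. impostor pairs.
    gen_a, imp_a = rng.normal(0.8, 0.10, 500), rng.normal(0.5, 0.10, 500)
    gen_b, imp_b = rng.normal(0.7, 0.15, 500), rng.normal(0.4, 0.15, 500)
    for rule in ("min", "mul", "sum"):
        eer = equal_error_rate(fuse_scores(gen_a, gen_b, rule),
                               fuse_scores(imp_a, imp_b, rule))
        print(f"{rule}-rule fusion EER: {eer:.2%}")
```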

Departamento de Computação  |  ICEB  |  Universidade Federal de Ouro Preto
Campus Universitário Morro do Cruzeiro  |  CEP 35400-000  |  Ouro Preto - MG, Brasil
Phone: +55 31 3559-1692  |  decom@ufop.edu.br