Fake faces for more ethics in biometrics
Most of us have our picture on the internet. Researchers or software designers who want to use those pictures to create a facial recognition tool face numerous scientific and ethical problems. “To avoid biases, researchers need not only a vast and varied set of pictures, but also one that is balanced in terms of people’s gender, age, and culture,” says Sébastien Marcel, head of the Biometrics security and privacy research group at Idiap. “That’s the main challenge: these sets of pictures rarely represent the population’s diversity, and when they do, it’s often impossible to reuse them for another research project because of personal data protection regulations.” Thanks to the financial support of the Hasler Foundation, the SAFER project will create faces of people who do not exist, which can then be used to develop ethical face recognition tools.
Open source and technology transfer
The evolving legal framework, especially at the European level, and the reluctance of even big tech companies to create biometric databases pose real challenges for biometrics research. By including several partners and by conducting open and replicable research, the scientists aim to establish a new scientific standard. They hope that, in the future, this approach will also prove useful for other areas of biometrics, such as voice recognition and fingerprints, and more broadly in domains that rely on machine learning.
Beyond academia, the project will involve an industrial partner, SICPA, from its very beginning. The company will play a key role in testing, evaluating, and using the software, the database-generation methods, and the databases themselves, all developed by Idiap and the University of Zurich. This collaboration will ensure that the research is not only replicable, but also usable in practice.
Planned to run for three years, the project includes a PhD student at Idiap and another at the University of Zurich.
More information
- Biometrics and privacy research group