Rémy Siegfried
About myself
I am from Saint-Maurice in Valais. I studied at EPFL, where I obtained a Bachelor's degree in Microengineering (2014) and then a Master's degree (MSc) in Robotics and Autonomous Systems (2016). I then worked for 5 years at the Idiap Research Institute, where I obtained my PhD (2021) under the supervision of Jean-Marc Odobez, in the Electrical Engineering Doctoral Program (EDEE) of EPFL.
Current work and research interests
I work as a postdoctoral researcher at Idiap in the Perception and Activity Understanding group.
I am interested in machine learning and data analysis in general, and more specifically in the perception of humans and its applications to human-computer/robot interaction and activity understanding.
Past projects
- 2021-2022 I worked on the P3 project (Innosuisse, SNSF) with the HES-SO and a company that wanted to improve its feasibility studies for newly ordered parts and the optimization of process parameters. The outcome is a web interface that provides a tool to browse previously produced parts using visual comparison, together with a model that predicts important processing variables from the process parameters.
- 2021-2022 In the frame of the NATAI project ("Agora" project, SNSF), I collaborated with the Musée de la Main (CHUV, Lausanne) and provided them with a live gaze tracking demo for their exhibition on artificial intelligence. The result is an autonomous demonstrator that displays a scene, estimates what the user is looking at, and adapts the audio accordingly.
- 2018-2021 The main funding for my thesis came from the MuMMER project ("Horizon 2020", EU), where I worked on modeling and inferring attention in human-robot interactions. Exploiting color and depth images as well as audio data, my goal was to estimate the individual attention of the people in a group interacting with a robot, in order to better understand conversation dynamics. I explored different topics and tasks, such as unsupervised gaze estimation calibration, eye movement recognition, and attention estimation in arbitrary settings.
- MuMMER project - Results in brief
- Paper presenting the latest version of the robot system developed during the MuMMER project
- Demo of the Idiap perception module in the MuMMER project
- 2017 During my PhD, I was involved in the UBImpressed project ("Sinergia", SNSF), whose goal was to study the formation of first impressions and its application to the training of hospitality employees. I mainly worked on gaze estimation and calibration.
- 2016 I did my Master's project in the MOBOTS group (EPFL) under the supervision of Francesco Mondada, in the field of learning analytics with mobile robots. I worked on methods that use the logs collected during a robot programming lecture to provide useful information to teachers and students, in order to improve the learning outcomes of lectures. I was then hired for 6 more months to continue this work and develop a tool that provides online hints to students learning robot programming, based on the results of my Master's project.
- 2015 I worked for 7 months at senseFly (in Cheseaux-sur-Lausanne) on the motor control of their quadrotor and on the development of a new camera interface (hardware) for a fixed-wing drone.
- 2014-2015 During my studies, I carried out two semester projects: one on the implementation of safety behaviours for quadrotor formations (at DISAL, EPFL) and a second on the design of legs for a quadruped robot (at BioRob, EPFL).
Material
- Code and data: Unsupervised gaze estimation calibration in conversation and manipulation settings (GitHub)
In our 2021 journal paper, we introduced a method that allows the unsupervised calibration of a gaze estimator using contextual priors based on top-down attention (i.e. attention related to the current task).
The data used in our experiments is available here: Idiap page, Zenodo
- ManiGaze dataset (Idiap page)
The ManiGaze dataset was created to evaluate gaze estimation from remote RGB and RGB-D (standard vision and depth) sensors in Human-Robot Interaction (HRI) settings, and more specifically during object manipulation tasks. The recording methodology was designed to let the user behave freely and encourage a natural interaction with the robot, as well as to automatically collect gaze targets, since a-posteriori annotation is almost impossible for gaze.
- VFOA module (GitHub)
A Python package for basic visual focus of attention (VFOA) estimation of people in a 3D scene, using geometrical and statistical models.
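To give an idea of what a purely geometrical VFOA model does: it assigns to a person the target whose direction is angularly closest to the estimated gaze (or head) direction, and labels the frame as aversion when no target is close enough. The sketch below is illustrative only; the function name and the angular threshold are my own assumptions, not the package's actual interface.

```python
import math

def vfoa_geometric(eye, gaze_dir, targets, max_angle_deg=15.0):
    """Minimal geometric VFOA model (illustrative sketch, not the package API).

    eye:      (x, y, z) position of the subject's eyes
    gaze_dir: (x, y, z) estimated gaze direction (need not be normalised)
    targets:  dict mapping target name -> (x, y, z) position
    Returns the name of the target angularly closest to the gaze direction,
    or 'aversion' if no target lies within max_angle_deg.
    """
    def normalise(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    g = normalise(gaze_dir)
    best, best_angle = "aversion", max_angle_deg
    for name, pos in targets.items():
        # Direction from the eyes to the candidate target
        d = normalise(tuple(p - e for p, e in zip(pos, eye)))
        # Angle between the gaze direction and the eye-to-target direction
        cos = max(-1.0, min(1.0, sum(a * b for a, b in zip(g, d))))
        angle = math.degrees(math.acos(cos))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

# Example: a person at the origin looking along the +x axis
targets = {"robot": (2.0, 0.1, 0.0), "tablet": (1.0, -1.0, 0.0)}
print(vfoa_geometric((0, 0, 0), (1, 0, 0), targets))  # -> robot
```

A statistical variant would replace the hard angular threshold with, e.g., per-target Gaussian likelihoods over the angle, but the geometric nearest-target rule above is the simplest baseline.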
Publications
- Robust Unsupervised Gaze Calibration Using Conversation and Manipulation Attention Priors
R. Siegfried and J.-M. Odobez
ACM Transactions on Multimedia Computing, Communications, and Applications, Volume 18, Issue 1, January 2022
- Modeling and Inferring Attention between Humans or for Human-Robot Interactions
R. Siegfried
EPFL Thesis, 2021
- Visual Focus of Attention Estimation in 3D Scene with an Arbitrary Number of Targets
R. Siegfried and J.-M. Odobez
IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW-GAZE2021), Virtual, 2021
- ManiGaze: a Dataset for Evaluating Remote Gaze Estimator in Object Manipulation Situations
R. Siegfried, B. Aminian and J.-M. Odobez
ACM Symposium on Eye Tracking Research & Applications (ETRA), Stuttgart, June 2020
- MuMMER: Socially Intelligent Human-Robot Interaction in Public Spaces
M. E. Foster, O. Lemon, J.-M. Odobez, R. Alami, A. Mazel, M. Niemela, et al.
AAAI Fall Symposium on Artificial Intelligence for Human-Robot Interaction (AI-HRI), Arlington, November 2019 (proceedings)
- A Deep Learning Approach for Robust Head Pose Independent Eye Movements Recognition from Videos
R. Siegfried, Y. Yu and J.-M. Odobez
ACM Symposium on Eye Tracking Research & Applications (ETRA), Denver, June 2019
- Facing Employers and Customers: What Do Gaze and Expressions Tell About Soft Skills?
S. Muralidhar, R. Siegfried, J.-M. Odobez and D. Gatica-Perez
International Conference on Mobile and Ubiquitous Multimedia (MUM), Cairo, November 2018
- Towards the Use of Social Interaction Conventions As Prior for Gaze Model Adaptation
R. Siegfried, Y. Yu and J.-M. Odobez
ACM International Conference on Multimodal Interaction (ICMI), Glasgow, November 2017
- Supervised Gaze Bias Correction for Gaze Coding in Interactions
R. Siegfried and J.-M. Odobez
ECEM Communication by Gaze Interaction (COGAIN) Symposium, Wuppertal, August 2017
- Improved Mobile Robot Programming Performance through Real-time Program Assessment
R. Siegfried, S. Klinger, M. Gross, R. W. Sumner, F. Mondada and S. Magnenat
Annual Conference on Innovation and Technology in Computer Science Education (ITiCSE), Bologna, July 2017
Events and media
- 10.05.2022 - Talk entitled "Thinking without brain: the robotic logic", organised by Sciences Valais in Sion as part of the international Pint of Science Festival
- 01.04.2022-31.03.2023 - Live gaze tracking demo for an exhibition on artificial intelligence at the Musée de la Main (CHUV, Lausanne) (article from 24heures, RTS news)
- 02.12.2021 - Public thesis defense - EPFL announcement
- 11.09.2021 - Idiap 30th anniversary - We presented a gaze tracking demo showing how a user can interact with a computer using gaze
- 08.09.2020 (postponed due to COVID-19) - Talk entitled "Thinking without brain: the robotic logic", organised by Sciences Valais in Sion as part of the international Pint of Science Festival
- 29.03.2019 - "Portrait de chercheur" made by Sciences Valais - YouTube video
- 29.08.2018 - Idiap's Innovation Days 2018 - article from Le Nouvelliste, video from 20 Minutes (0m55-1m40)
Contact
remy.siegfried@idiap.ch