Multimodal and emotional gaming smartband

Multimodal emotion recognition for adaptive gaming based on a smartband end-to-end solution

Wearable technology is a rapidly growing field, yet the quality of the signals collected by its biosensors remains a major challenge. Current wearable technologies put a strong emphasis on health and wellness, although they could also serve other domains such as entertainment. This proposal focuses on measuring users' experience and emotions in the context of video games.

Although physiological emotion recognition devices achieve reasonable accuracy in the laboratory, their performance drops in ecological settings. This can be explained by two factors: noise in biosensor measurements and missing contextual information. Consequently, the project objective is to improve the quality of the signals collected by the Ovomind biosensors and to determine video games' contextual information automatically.
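
To illustrate the first factor, here is a minimal sketch of how noisy segments might be flagged in a raw electrodermal activity (EDA) trace. The function name, window length, and thresholds are illustrative assumptions, not part of the Ovomind pipeline.

```python
import numpy as np

def flag_noisy_segments(eda, fs=4.0, win_s=5.0, max_jump=0.5, min_std=1e-4):
    """Flag fixed-length windows of an EDA trace as noisy.

    A window is marked noisy if it contains an abrupt amplitude jump
    (a typical motion artifact) or is nearly flat (a typical sign of
    lost electrode contact). Thresholds are illustrative only.
    """
    win = int(win_s * fs)                    # samples per window
    n_windows = len(eda) // win
    flags = np.zeros(n_windows, dtype=bool)
    for i in range(n_windows):
        seg = eda[i * win:(i + 1) * win]
        if np.abs(np.diff(seg)).max() > max_jump or seg.std() < min_std:
            flags[i] = True
    return flags

# Synthetic 60 s EDA trace at 4 Hz with one injected artifact.
rng = np.random.default_rng(0)
eda = 2.0 + 0.05 * rng.standard_normal(240)
eda[100] += 2.0                              # simulated motion artifact
print(flag_noisy_segments(eda))              # expect window 5 to be flagged
```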

Multimodal deep learning methods will be developed to (i) identify and correct noisy physiological signals, and (ii) identify context from game videos. This approach will be validated by measuring the impact of both methods on emotion recognition performance. A sufficient quantity of data will be obtained by combining existing databases with data collected during the project from gamers playing their games at home.
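
As a sketch of what method (i) could look like, the following assumes a 1D convolutional denoising autoencoder in PyTorch that reconstructs clean signal windows from corrupted ones. The architecture, layer sizes, and window length are illustrative assumptions, not the project's actual model.

```python
import torch
import torch.nn as nn

class DenoisingAutoencoder1D(nn.Module):
    """Illustrative 1D convolutional autoencoder mapping a noisy
    physiological signal window to a cleaned reconstruction."""

    def __init__(self, channels=1):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(channels, 16, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(32, 16, kernel_size=7, stride=2,
                               padding=3, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose1d(16, channels, kernel_size=7, stride=2,
                               padding=3, output_padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# One training step: reconstruct clean windows from corrupted ones.
model = DenoisingAutoencoder1D()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
clean = torch.randn(8, 1, 256)                  # batch of clean windows
noisy = clean + 0.1 * torch.randn_like(clean)   # synthetic corruption
optimizer.zero_grad()
loss = nn.functional.mse_loss(model(noisy), clean)
loss.backward()
optimizer.step()
print(loss.item())
```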

This scientific partnership will create value by advancing wearable technology and improving the reliability of Ovomind's biosensing smartbands. The project will produce new insights and breakthroughs that drive innovation by providing an emotional software development kit for game developers, so they can create a new generation of video games that enhance players' experience. The project also has the potential to generate significant commercial benefits, such as increased revenue and market share, by enabling the development of new products.
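
As a purely hypothetical illustration of how such an emotional software development kit might be exposed to game developers (every name and the API shape below are assumptions, not Ovomind's actual interface):

```python
# Hypothetical SDK usage: all names here are illustrative, not Ovomind's API.
from dataclasses import dataclass
import random

@dataclass
class EmotionState:
    arousal: float   # 0.0 (calm) .. 1.0 (excited)
    valence: float   # 0.0 (negative) .. 1.0 (positive)

class EmotionClient:
    """Stand-in for a smartband client streaming emotion estimates."""
    def latest(self) -> EmotionState:
        return EmotionState(arousal=random.random(), valence=random.random())

# Game loop sketch: adapt difficulty to the player's estimated state.
client = EmotionClient()
state = client.latest()
if state.arousal < 0.3:
    print("Player seems bored: increase enemy spawn rate.")
elif state.arousal > 0.8 and state.valence < 0.4:
    print("Player seems frustrated: ease the difficulty.")
else:
    print("Keep current difficulty.")
```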

Project in collaboration with Ovomind.

Dr. Guillaume Chanel
Head of the SIMS group

Developing user sensing