Emotions for game streaming

Emotionally intelligent peripherals for video game streamers and players

The broadcasting of original media through online platforms like YouTube or Twitch, known as “streaming”, is now more popular than traditional TV. Video game streamers, for instance, go through an emotionally rich playing experience yet struggle to convey it to their viewers: today there are few ways to identify and share moments of intense excitement other than tracking game events such as score.

With this project the partners will develop a new generation of emotion-enabled peripherals. It will deliver a first-of-its-kind real-time emotional monitoring system for game streaming, able to automatically augment exciting moments in live broadcasts or to edit a digest video of such highlights. Both will help streamers better engage with their community and stand out from their peers.

Deep multimodal learning will be used to identify emotional segments and predict their intensity. Many game aficionados will annotate game segments emotionally while their multimodal reactions (physiological signals, body movements, facial expressions, speech) are collected during gaming. From this big data, the main challenges will be to build emotion recognition models which can generalize to any user, combine several affective expressions, and consider the flow of emotions between users.
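To illustrate the general idea, the sketch below shows a toy late-fusion scheme for multimodal intensity prediction: each modality is encoded separately, the embeddings are concatenated, and a shared head maps them to an intensity score. All names, dimensions, and weights here are hypothetical placeholders; the project's actual deep models are not specified in this description.

```python
import numpy as np

# Illustrative sketch only: late fusion of multimodal signals into a
# single emotion-intensity score. Weights are random stand-ins for
# trained encoders; modality names and dimensions are assumptions.

rng = np.random.default_rng(0)

def encode(features, w):
    """Toy per-modality encoder: linear projection followed by ReLU."""
    return np.maximum(features @ w, 0.0)

# Hypothetical feature dimensions for the four modalities mentioned above.
dims = {"physiology": 8, "movement": 6, "face": 10, "speech": 12}
hidden = 4

# Randomly initialized encoder and fusion weights (placeholders).
weights = {m: rng.normal(size=(d, hidden)) for m, d in dims.items()}
fusion_w = rng.normal(size=(hidden * len(dims), 1))

def predict_intensity(sample):
    """Concatenate per-modality embeddings and squash to [0, 1]."""
    parts = [encode(sample[m], weights[m]) for m in dims]
    fused = np.concatenate(parts)
    return float(1.0 / (1.0 + np.exp(-(fused @ fusion_w)[0])))  # sigmoid

# One synthetic multimodal sample, one intensity prediction.
sample = {m: rng.normal(size=d) for m, d in dims.items()}
intensity = predict_intensity(sample)
```

A real system would replace the random projections with trained deep networks per modality and could add temporal models to capture how emotions flow between players and viewers.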

Project in collaboration with Logitech.

Dr. Guillaume Chanel
Head of the SIMS group

Developing user sensing