
Cinéma Émotif, the neuro-interactive cinema

Case Study

Problem - Traditional cinema keeps audiences passive, and interactive films often disrupt immersion; Cinéma Émotif faced challenges in capturing and integrating real-time emotions due to a complex multi-device setup.

Solution - Cinéma Émotif uses hyperscanning EEG headsets and Riemannian algorithms to monitor audience emotions and dynamically alter the film's narrative, ensuring continuous immersion.

Result - Cinéma Émotif significantly enhances audience engagement, offering unique viewing experiences influenced by collective emotions.

Problem - Creating an Emotionally Interactive Film Experience

Traditional cinematic experiences confine audiences to passive roles, devoid of any sway over the unfolding narrative. Despite the emergence of interactive films, immersion often falters as viewers are compelled to make explicit choices, disrupting emotional engagement and narrative coherence.

In this landscape, the first iteration of Cinéma Émotif, by Marie-Laure Cazin, encountered a formidable challenge: developing a system capable of capturing and seamlessly incorporating real-time audience emotions into the film's storyline. This endeavor was further complicated by the need for a complex setup involving the precise calibration and synchronization of multiple EEG devices and analysis software.

These technical intricacies not only posed logistical hurdles but also raised questions about the feasibility of implementing such a system within the constraints of standard cinema environments. Additionally, ensuring that the audience's emotional responses were accurately reflected in the narrative progression presented an artistic and technical dilemma, requiring innovative solutions to maintain the integrity of the viewing experience.

Solution - Seamless Emotional Integration into Film Narrative

Cinéma Émotif, supported by SCRIME, LaBRI, Acce)s( and Mentalista, tackles this challenge head-on by harnessing the power of neurotechnology to craft an emotionally responsive cinematic experience. Through the utilization of EEG headsets worn by the audience, their emotional responses are monitored and analyzed in real-time.
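The source does not disclose how emotional responses are derived from the EEG signal, but a common first step in emotion monitoring is extracting band-power features such as frontal alpha asymmetry, a widely used heuristic for valence. A minimal sketch under that assumption, with synthetic signals and an illustrative sampling rate (neither taken from the source):

```python
import numpy as np

FS = 250  # Hz; assumed sampling rate, not specified in the source

def band_power(signal, fs, band=(8.0, 13.0)):
    """Power in a frequency band (default: alpha) via a simple periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].sum()

def frontal_alpha_asymmetry(left_ch, right_ch, fs=FS):
    """log(right alpha power) - log(left alpha power); higher values are
    often read as more positive valence. A heuristic, not Mentalista's
    published method."""
    return np.log(band_power(right_ch, fs)) - np.log(band_power(left_ch, fs))

# Synthetic two-channel example: one second of noise per frontal channel
rng = np.random.default_rng(0)
left, right = rng.standard_normal(FS), rng.standard_normal(FS)
score = frontal_alpha_asymmetry(left, right)
```

In a real-time setting, this computation would run on a sliding window per viewer, producing a continuous emotion estimate for the narrative engine to consume.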

The project is based on the pilot film "Mademoiselle Paradis" (2014), directed by Marie-Laure Cazin, which narrates the story of Mesmer and his patient Mademoiselle Paradis in Vienna in 1777. The twenty-minute pilot presents several possible sequences to create 12 different scenarios.
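One way to picture how a few branch points yield twelve scenarios: at each branch, the collective audience emotion selects one of several alternative sequences. The branch structure and emotion labels below are hypothetical (the source only states the scenario count), but a 2 × 3 × 2 tree reproduces it:

```python
from collections import Counter
from itertools import product

# Hypothetical branch structure: at each branch point, the majority
# audience emotion picks one of the available sequences.
BRANCHES = {
    "act1": {"calm": "A1", "tense": "A2"},
    "act2": {"calm": "B1", "tense": "B2", "surprised": "B3"},
    "act3": {"calm": "C1", "tense": "C2"},
}

def majority_emotion(audience_labels):
    """Collective emotion = the most common per-viewer label."""
    return Counter(audience_labels).most_common(1)[0][0]

def screening_path(labels_per_branch):
    """Map one audience's emotions at each branch to a scenario path."""
    return tuple(BRANCHES[b][majority_emotion(labels)]
                 for b, labels in labels_per_branch.items())

# 2 x 3 x 2 = 12 possible scenarios, matching the pilot's count
n_scenarios = len(list(product(*[v.values() for v in BRANCHES.values()])))
path = screening_path({
    "act1": ["calm", "tense", "calm"],
    "act2": ["tense", "tense", "surprised"],
    "act3": ["calm", "calm", "tense"],
})
```

Because the selection is driven by aggregate emotion rather than explicit choices, viewers never step out of the story to vote, which is the immersion property the project emphasizes.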

This wealth of emotional data serves as the driving force behind dynamically altering both the film's narrative and its soundtrack, facilitating a seamless and uninterrupted journey for viewers. The implementation of hyperscanning EEG headsets, coupled with the refinement of emotional analysis algorithms using a Riemannian approach, significantly enhances the system's real-time processing capability, ensuring swift and accurate adaptation to audience sentiment.
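The Riemannian approach typically represents each EEG window by its spatial covariance matrix and classifies it by geodesic distance to per-class mean matrices (minimum distance to mean). A minimal numpy sketch of that idea, with synthetic data; the actual Mentalista pipeline is not public:

```python
import numpy as np

def covariance(window):
    """Spatial covariance of one EEG window (channels x samples)."""
    w = window - window.mean(axis=1, keepdims=True)
    return w @ w.T / w.shape[1]

def riemann_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices:
    sqrt(sum_i log(lambda_i)^2), lambda_i = eigenvalues of A^-1 B."""
    eigvals = np.linalg.eigvals(np.linalg.solve(A, B)).real
    return np.sqrt(np.sum(np.log(eigvals) ** 2))

def classify(window, class_means):
    """Minimum-distance-to-mean: assign the class whose mean covariance
    is nearest in the Riemannian metric."""
    cov = covariance(window)
    return min(class_means, key=lambda c: riemann_distance(class_means[c], cov))

# Toy example: two synthetic 'emotion' classes with different signal scales
rng = np.random.default_rng(1)
calm = covariance(rng.standard_normal((4, 500)))
tense = covariance(3.0 * rng.standard_normal((4, 500)))
means = {"calm": calm, "tense": tense}
label = classify(2.8 * rng.standard_normal((4, 500)), means)
```

This metric is invariant to linear mixing of the sources, which is one reason Riemannian classifiers have become a strong baseline for EEG decoding; libraries such as pyriemann package the same components for production use.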

Result - Enhanced Audience Engagement and Interaction

The implementation of Cinéma Émotif marks a significant leap forward in audience interaction and engagement within cinematic realms. Seamlessly attuned to the collective emotional pulse of viewers, the film delivers a tailor-made and immersive experience on every occasion.

This technological breakthrough not only enriches the storytelling experience but also revolutionizes the audience-film dynamic, fostering a deeper and more meaningful connection. The outcome of these advancements is a robust and adaptable system poised for seamless integration into diverse cinema settings.

Empowered to accommodate a larger number of participants, Cinéma Émotif engenders an emotionally immersive environment, where the film's narrative pathways are directly shaped by the collective emotional tapestry of the audience. Each screening of "Mademoiselle Paradis" unfolds as a distinct and personalized journey, ensuring a truly unforgettable cinematic encounter.

Beyond traditional cinemas, this technology holds potential for home use in video streaming and TV. By integrating the system into personal entertainment setups, viewers can experience emotionally responsive films from the comfort of their homes. Moreover, this approach opens new avenues for interactive storytelling in various media formats, including gaming and virtual reality experiences, where real-time emotional feedback can significantly enhance user immersion and engagement.

Featured products

Rosette (M04), Mental SDK

Project Development Time

18 months

Want to start working with brain-environment interfaces?

Contact us →

