The concept of emotional engagement is pretty straightforward. Consumers have an “ideal” image of every product and service – including entertainment and experiential events – and particularly, movies. Ultimately, emotional engagement is the yardstick consumers use to measure brands and entertainment. But defining the category’s ideal gets very tricky for one particular reason.
To define a category ideal accurately, one needs below-the-radar psychological metrics, because today's consumers do not behave as they say, do not say what they really think, and do not think what they feel. So a 10-point scale just won't do it anymore!
Today, it's all about emotional engagement, with the emphasis on "emotional." Consumers talk to themselves before they talk to brands. They're hard-wired to social networking, which supercharges expectations for the category being "shared." The result? Massive gaps between what people really want and what brands, entertainment, and experiential events deliver.
The ideal, of course, is not static. It shifts as consumers' valuation of the category changes, or, in the case of movies, as the particular category the movie, actor, or director falls into changes. And because emotional values are the main drivers today, changes to the ideal, and how well something meets that ideal, are predictive of how consumers will behave. In the case of movies, it's a matter of how the audience reacts to them, tweets about them, and shares them with friends and family.
Testing Oscar Nominees’ Emotional Engagement
This year's roster of Academy Award nominations was put to the emotional engagement test. The results below indicate the degree to which the nominees lived up to the audience's ideals, translated into odds:
Best Picture
| Nominee | Odds |
| --- | --- |
| La La Land | 1/6 |
| Moonlight | 6/1 |
| Hidden Figures | 10/1 |
| Manchester By The Sea | 16/1 |
| Fences | 50/1 |
| Hacksaw Ridge | 60/1 |
| Arrival | 80/1 |
| Lion | 80/1 |
| Hell or High Water | 100/1 |
Best Actor
| Nominee | Odds |
| --- | --- |
| Casey Affleck | 4/9 |
| Denzel Washington | 3/2 |
| Ryan Gosling | 12/1 |
| Andrew Garfield | 30/1 |
| Viggo Mortensen | 100/1 |
Best Actress
| Nominee | Odds |
| --- | --- |
| Emma Stone | 1/6 |
| Natalie Portman | 4/1 |
| Ruth Negga | 40/1 |
| Meryl Streep | 50/1 |
Supporting Actor
| Nominee | Odds |
| --- | --- |
| Mahershala Ali | 1/10 |
| Jeff Bridges | 11/1 |
| Michael Shannon | 12/1 |
| Lucas Hedges | 15/1 |
| Dev Patel | 15/1 |
Supporting Actress
| Nominee | Odds |
| --- | --- |
| Viola Davis | 1/25 |
| Michelle Williams | 9/1 |
| Naomie Harris | 15/1 |
| Nicole Kidman | 25/1 |
| Octavia Spencer | 50/1 |
Best Director
| Nominee | Odds |
| --- | --- |
| Damien Chazelle | 1/10 |
| Kenneth Lonergan | 7/1 |
| Barry Jenkins | 8/1 |
| Denis Villeneuve | 50/1 |
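As a reading aid (and not part of the original research), fractional odds of a/b can be converted into an implied probability of b/(a+b), so short odds like 1/6 signal a strong favorite. A minimal Python sketch, using a few of the Best Picture figures above:

```python
def implied_probability(odds: str) -> float:
    """Convert fractional odds like '1/6' to an implied win probability."""
    a, b = (int(part) for part in odds.split("/"))
    return b / (a + b)

# A few of the Best Picture odds listed above.
best_picture = {
    "La La Land": "1/6",
    "Moonlight": "6/1",
    "Hidden Figures": "10/1",
}

for title, odds in best_picture.items():
    print(f"{title}: {implied_probability(odds):.1%}")
# La La Land: 85.7%
# Moonlight: 14.3%
# Hidden Figures: 9.1%
```

Note that implied probabilities across a field typically sum to more than 100%; here they simply express the relative strength of each favorite.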
What consumers expected is expressed as index numbers, indexed against a category benchmark of 100. These assessments not only identify the ideal, but also allow us to measure the degree to which the nominees met consumer expectations for the path-to-purchase (or in this case, path-to-picture) driver that defines the ideal.
The research technique, a combination of psychological inquiry and higher-order statistical analyses, has a test/re-test reliability of 0.93, accounts for 96% of the variance in a category, and provides results generalizable at the 95% confidence level.
Please note the odds are for entertainment value and engagement diagnostics only. That said, if you are looking to make money, it's generally a bad idea to bet against emotional engagement in any category, because it's predictive of how people will behave in the marketplace.
Or in this case, the movie theatre.
Image: Davidlohr Bueso