A pilot study for mood-based classification of TV programmes

  • Authors:
  • Jana Eggink, Penelope Allen, Denise Bland

  • Affiliations:
  • BBC R&D, London, UK; BBC R&D, Dock House, MediaCityUK, Salford, UK; BBC R&D, London, UK

  • Venue:
  • Proceedings of the 27th Annual ACM Symposium on Applied Computing
  • Year:
  • 2012


Abstract

We report results from a pilot study on mood-based classification of TV programmes. Short video clips from various programmes were labelled on three mood axes, capturing the subjects' opinions of how happy, serious and exciting each clip was. This data was used for mood classification based on automatically extracted audio and video features in a machine learning framework. Attention was given to the challenges of dealing with a small dataset, as is commonly obtained from pilot studies, showing that a thorough evaluation was nevertheless possible and produced useful results. Introducing a new feature based on face detection and combining it with other signal processing features led to good classification accuracies. These lay between 85% and 100% for the simplest setting, and still reached more than 70% accuracy when a finer three-point mood scale was used. Overall, the results were promising and showed that automatic mood classification of video material is possible. Moods can therefore be used as additional metadata to facilitate search in large archives.
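
To make the described setup more concrete, the sketch below shows one plausible way to build a face-detection-based feature and evaluate a classifier with leave-one-out cross-validation, a common strategy for the small datasets typical of pilot studies. This is an illustrative assumption, not the authors' actual pipeline: the face-presence ratio, the SVM classifier, and the `clips`/`happy_labels` placeholders are all hypothetical choices standing in for the audio/video features and mood labels described in the abstract.

```python
# Hypothetical sketch of a face-presence feature plus small-dataset evaluation.
# Feature choice, classifier and variable names are illustrative assumptions.
import cv2
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Standard frontal-face Haar cascade shipped with opencv-python.
_face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def face_presence_ratio(video_path, frame_step=25):
    """Fraction of sampled frames containing at least one detected face."""
    cap = cv2.VideoCapture(video_path)
    sampled, with_face, idx = 0, 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % frame_step == 0:  # sample roughly one frame per second at 25 fps
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = _face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            sampled += 1
            with_face += int(len(faces) > 0)
        idx += 1
    cap.release()
    return with_face / sampled if sampled else 0.0

def evaluate(clips, happy_labels):
    """Leave-one-out accuracy for a binary mood label on a small clip set."""
    # One feature per clip here; in practice this would be concatenated with
    # other audio and video signal-processing features.
    X = np.array([[face_presence_ratio(path)] for path in clips])
    y = np.array(happy_labels)  # e.g. 0 = "not happy", 1 = "happy"
    clf = SVC(kernel="rbf")
    scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
    return scores.mean()
```

Leave-one-out cross-validation uses every labelled clip for testing exactly once, which makes the most of a small pilot dataset at the cost of higher variance in the accuracy estimate; a finer mood scale (e.g. a three-point rating) would simply change the label set passed to the classifier.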