Publication date: 9 August 2011

Min Gyo Chung, Taehyung (George) Wang and Phillip C.‐Y. Sheu

Abstract

Purpose

Video summarisation is one of the most active fields in content‐based video retrieval research. This paper proposes a new video summarisation scheme based on socially generated temporal tags.

Design/methodology/approach

To capture users' collaborative tagging activities, the proposed scheme maintains video bookmarks, which record temporal or positional information about videos, such as relative time codes or byte offsets. For each video, all the bookmarks collected from users are then statistically analysed to extract meaningful key frames (the video equivalent of keywords), which collectively constitute the summary of the video.
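
The abstract does not specify how the bookmarks are statistically analysed. As a minimal sketch, assuming a simple histogram approach, one could bin the collected time codes and treat unusually dense bins as key-frame candidates; the bin width, threshold, and function name below are illustrative assumptions rather than the authors' method.

```python
from collections import Counter

def keyframe_times(bookmarks, bin_size=5.0, min_count=3):
    """Pick candidate key-frame times from user bookmark timestamps.

    bookmarks : bookmark positions in seconds (relative time codes)
    bin_size  : width of each time bin in seconds (assumed parameter)
    min_count : bookmarks a bin needs to count as a peak (assumed parameter)
    """
    # Bucket every bookmark into a fixed-width time bin.
    bins = Counter(int(t // bin_size) for t in bookmarks)
    # Keep bins that attract enough bookmarks and are at least as
    # dense as both neighbouring bins (local maxima).
    peaks = [
        b for b, n in bins.items()
        if n >= min_count and n >= bins.get(b - 1, 0) and n >= bins.get(b + 1, 0)
    ]
    # Return the centre of each peak bin as a key-frame timestamp.
    return sorted((b + 0.5) * bin_size for b in peaks)

# Example: bookmarks from several users clustering around 10-14 s
# yield a single key-frame candidate near 12.5 s.
print(keyframe_times([10, 11, 12, 13, 14, 40, 93, 96]))
```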

Findings

Whereas traditional video summarisation methods rely on low‐level audio‐visual features, the proposed method is based on users' high‐level collaborative activities, and can therefore produce semantically more meaningful summaries than existing methods.

Research limitations/implications

It is assumed that the video frames around the bookmarks inserted by users are informative and representative, and therefore can be used as good sources for summarising videos.

Originality/value

Folksonomy, commonly called collaborative tagging, is a Web 2.0 method for users to freely annotate shared information resources with keywords. It has mostly been used for collaboratively tagging photos (Flickr), web site bookmarks (Del.icio.us), or blog posts (Technorati), but has never been applied to the field of automatic video summarisation. It is believed that this is the first attempt to utilise users' high‐level collaborative tagging activities, instead of low‐level audio‐visual features, for video summarisation.

Details

Online Information Review, vol. 35, no. 4
Type: Research Article
ISSN: 1468-4527
