IEEE TCDL Bulletin
Volume 3   Issue 2
Summer 2007

 

Content-based Video Browsing

Xiangming Mu
University of Wisconsin-Milwaukee
3210 N. Maryland Ave., Milwaukee, WI 53211
(414)-229-6039
<mux@uwm.edu>

 

Research has demonstrated that browsing is an effective approach to content-based shot retrieval, particularly when exact queries are hard to formulate. Current browsing technologies are based either on temporal neighbors or on visual similarity. We propose a new browsing technology, Visual Neighbor Similarity (VNS) browsing, which integrates both temporal distance and visual similarity.

Temporal neighbor browsing allows users to navigate around the selected sample shot keyframe, because potentially relevant shots may appear just before or after the sample. Visual similarity browsing allows users to navigate keyframes whose visual features (e.g., color and layout) are "similar" to those of the sample. VNS browsing balances visual similarity against temporal distance. We used the traditional color histogram algorithm to measure visual similarity (V). For simplicity, we selected the 1/x function as the neighboring distance score (N), where x denotes the distance (number of shots) between a keyframe and the sample keyframe. The relevance score (R) can thus be represented as

R = w₁V + w₂N

Here w₁ and w₂ are weighting factors and w₁ + w₂ = 1.
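
As a rough illustrative sketch only (not the system's actual implementation), the code below computes R for one candidate keyframe. It assumes keyframes are available as RGB image arrays, uses histogram intersection as one plausible color-histogram similarity measure, and treats the weights w1 and w2 as free parameters.

import numpy as np

def color_histogram(image, bins=8):
    # Per-channel color histogram over an H x W x 3 uint8 RGB array,
    # concatenated across channels and L1-normalized.
    hist = np.concatenate([
        np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
        for c in range(3)
    ]).astype(float)
    return hist / hist.sum()

def visual_similarity(frame, sample):
    # Histogram intersection: 1.0 means identical color distributions.
    return float(np.minimum(color_histogram(frame),
                            color_histogram(sample)).sum())

def neighbor_score(x):
    # N = 1/x, where x >= 1 is the shot distance to the sample keyframe.
    return 1.0 / max(x, 1)

def vns_score(frame, sample, shot_distance, w1=0.5, w2=0.5):
    # R = w1*V + w2*N, with w1 + w2 = 1.
    return w1 * visual_similarity(frame, sample) + w2 * neighbor_score(shot_distance)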

Based on the values of R, a number of "related" keyframes are then selected and presented as a filmstrip for navigation.
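
A minimal follow-on sketch of this selection step, reusing the hypothetical vns_score function above; the keyframe list, sample index, and filmstrip size k are assumed inputs.

def build_filmstrip(keyframes, sample_index, k=10, w1=0.5, w2=0.5):
    # Rank every other keyframe by R against the sample keyframe
    # and keep the top-k indices for display as a filmstrip.
    sample = keyframes[sample_index]
    scored = [(i, vns_score(frame, sample, abs(i - sample_index), w1, w2))
              for i, frame in enumerate(keyframes) if i != sample_index]
    scored.sort(key=lambda item: item[1], reverse=True)
    return [i for i, _ in scored[:k]]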

A new video navigation system (Figure 1) was developed as the platform for evaluation.

An online demo is available (http://pc-43-162.slis.uwm.edu/projects/videonavigator/).

Figure 1. The video navigation system.

 

© Copyright 2007 Xiangming Mu
Some or all of these materials were previously published in the Proceedings of the 6th ACM/IEEE-CS Joint Conference on Digital Libraries, ACM 1-59593-354-9.
