Content-based Video Browsing
Research has demonstrated that browsing is an effective approach to content-based shot retrieval, particularly when exact queries are hard to formulate. Current browsing technologies are based on either temporal neighbors or visual similarity. We propose a new browsing technology, Visual Neighbor Similarity (VNS) browsing, which integrates both temporal distance and visual similarity factors.
Temporal neighbor browsing allows users to navigate around the selected sample shot keyframe, because potentially relevant shots may appear just before or after the sample. Visual similarity browsing allows users to navigate keyframes that have visual features (e.g., color and layout) "similar" to the sample. VNS browsing balances visual similarity against temporal distance. We used the traditional color histogram algorithm to measure visual similarity (V). For simplicity, we selected the 1/x function as the neighboring distance measure (N), where x denotes the distance (number of shots) between a keyframe and the sample keyframe. The relevance score (R) can thus be represented as
R = α₁V + α₂N

Here α₁ and α₂ are weighting factors with α₁ + α₂ = 1.
Based on the values of R, a number of "related" keyframes are selected and presented as a filmstrip for navigation.
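The VNS ranking above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' implementation: histogram intersection stands in for the unspecified "traditional color histogram algorithm", the function names and the equal weights α₁ = α₂ = 0.5 are assumptions, and the distance is taken as the absolute shot offset.

```python
import numpy as np

def histogram_similarity(h1, h2):
    # Visual similarity V via histogram intersection of normalized
    # color histograms (a stand-in for the paper's unspecified
    # "traditional color histogram algorithm"); result is in [0, 1].
    h1 = np.asarray(h1, dtype=float) / np.sum(h1)
    h2 = np.asarray(h2, dtype=float) / np.sum(h2)
    return float(np.minimum(h1, h2).sum())

def vns_scores(sample_idx, histograms, alpha1=0.5, alpha2=0.5):
    # Relevance R = alpha1*V + alpha2*N for each keyframe, where
    # N = 1/x and x is the shot distance to the sample keyframe.
    assert abs(alpha1 + alpha2 - 1.0) < 1e-9  # weights must sum to 1
    scores = []
    for i, h in enumerate(histograms):
        if i == sample_idx:
            continue  # the sample itself is not a candidate
        v = histogram_similarity(histograms[sample_idx], h)
        n = 1.0 / abs(i - sample_idx)
        scores.append((alpha1 * v + alpha2 * n, i))
    return sorted(scores, reverse=True)

def related_keyframes(sample_idx, histograms, k=4, **weights):
    # Top-k "related" keyframes, e.g. for a filmstrip display.
    return [i for _, i in vns_scores(sample_idx, histograms, **weights)[:k]]
```

With equal weights, a visually identical keyframe adjacent to the sample scores highest, while an identical keyframe further away is penalized by the 1/x term, which is exactly the trade-off VNS is meant to capture.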
A new video navigation system (Figure 1) was developed as the platform for evaluation. An online demo is available (http://pc-43-162.slis.uwm.edu/projects/videonavigator/).
© Copyright 2007 Xiangming Mu