Novel video retrieval system
Unlike a collection of still images, video offers uninterrupted scenes spanning time and space. However, this information is embedded in the raw data rather than organised for efficient access, which makes it difficult to retrieve. The VIBES project therefore set out to turn video into a data type that is searchable by content and that can be annotated, hyperlinked and edited just like text. The tools developed support cut detection, indexing, synthesis and classification of non-static, non-rigid scenes. Building on rapid browsing and retrieval, a video can be enhanced with hyperlinks that connect shots to an actor, a type of action or a scene, enabling more efficient video browsing. In addition, 3D scene synthesis and human animation models allow virtual-reality environments to be created for particular shots. A demonstration is available at: http://www.robots.ox.ac.uk/~vgg/research/vgoogle/
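The cut-detection step mentioned above can be illustrated with a minimal sketch. The histogram-difference method, the threshold value, and the synthetic frames below are illustrative assumptions, not the VIBES implementation: a shot boundary is flagged wherever the grey-level histograms of consecutive frames differ strongly.

```python
import numpy as np

def frame_histogram(frame, bins=16):
    # Normalised grey-level histogram of one frame.
    hist, _ = np.histogram(frame, bins=bins, range=(0, 256))
    return hist / hist.sum()

def detect_cuts(frames, threshold=0.5):
    # Illustrative cut detector: flag a cut at index i when the L1
    # distance between consecutive frame histograms exceeds a threshold.
    cuts = []
    for i in range(1, len(frames)):
        d = np.abs(frame_histogram(frames[i])
                   - frame_histogram(frames[i - 1])).sum()
        if d > threshold:
            cuts.append(i)
    return cuts

# Two synthetic "shots": dark frames followed by bright frames,
# so the only cut lies at frame index 3.
rng = np.random.default_rng(0)
shot_a = [rng.integers(0, 60, (48, 64)).astype(np.uint8) for _ in range(3)]
shot_b = [rng.integers(180, 250, (48, 64)).astype(np.uint8) for _ in range(3)]
print(detect_cuts(shot_a + shot_b))  # -> [3]
```

Real systems refine this basic idea with motion-compensated comparisons and adaptive thresholds to distinguish true cuts from fast camera or object motion.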