Scene segmentation in video sequences by an RPCL neural network

Di Lecce, Vincenzo; Guerriero, A.
1998

Abstract

Video database management systems require efficient methods to abstract video information. Identifying the shots in a video sequence is an important task for summarizing the content of a video. We describe a neural-network-based technique for automatic clustering of the frames in video sequences. From each frame, the features that describe the image content are extracted to form a signature. These signatures are clustered with a rival penalized competitive learning (RPCL) neural network, chosen for its ability to automatically detect the number of classes in the data set. The results presented in the paper show that, for image clustering in video sequences, the RPCL network automatically extracts the correct number of classes, and hence the correct number of scenes, and produces a class partition that agrees with a human model of the sequences.
IEEE World Congress on Computational Intelligence
ISBN: 0-7803-4859-1
Use this identifier to cite or link to this item: http://hdl.handle.net/11589/14072
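
The abstract does not give implementation details, but the RPCL rule it refers to is well documented in the literature: for each input, the winning prototype is pulled toward the input while the second-best (rival) prototype is pushed away with a much smaller rate, so superfluous prototypes drift out of the data and the surviving ones indicate the number of clusters, i.e. scenes. The following minimal sketch illustrates that rule applied to per-frame signatures; the function name, the value of k_max, the learning rates and the use of flattened colour-histogram signatures are illustrative assumptions, not the authors' settings.

import numpy as np

def rpcl_cluster(signatures, k_max=10, lr_win=0.05, lr_rival=0.002, epochs=50, seed=0):
    # Minimal RPCL sketch (hypothetical parameters): the winner is attracted to
    # each input, the rival is penalised (repelled), and extra prototypes die out.
    rng = np.random.default_rng(seed)
    X = np.asarray(signatures, dtype=float)
    W = X[rng.choice(len(X), size=k_max, replace=False)].copy()  # initial prototypes
    wins = np.ones(k_max)                      # win counts for the frequency term
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            gamma = wins / wins.sum()          # conscience: frequent winners are handicapped
            d = gamma * np.sum((W - x) ** 2, axis=1)
            winner, rival = np.argsort(d)[:2]
            W[winner] += lr_win * (x - W[winner])    # pull the winner toward the frame
            W[rival] -= lr_rival * (x - W[rival])    # push the rival away (the penalty)
            wins[winner] += 1
    # Assign each frame to its nearest prototype; prototypes that win nothing
    # have been driven away, so the surviving prototypes estimate the scene count.
    labels = np.argmin(np.sum((X[:, None, :] - W[None]) ** 2, axis=2), axis=1)
    active = np.unique(labels)
    return W[active], np.searchsorted(active, labels)

Under these assumptions, calling rpcl_cluster on an array of per-frame colour-histogram signatures returns the surviving prototypes and a scene label for every frame; changes between consecutive labels then mark candidate scene boundaries.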