Distributed Computing Grid Experiences in CMS / Andreeva, J.; Anjum, A.; Barrass, T.; Bonacorsi, D.; Bunn, J.; Capiluppi, P.; Corvo, M.; Darmenov, N.; De Filippis, N.; Donno, F.; Donvito, G.; Eulisse, G.; Fanfani, A.; Fanzago, F.; Filine, A.; Grandi, C.; Hernández, J. M.; Innocente, V.; Jan, A.; Lacaprara, S.; Legrand, I.; Metson, S.; Newman, H.; Newbold, D.; Pierro, A.; Silvestris, L.; Steenberg, C.; Stockinger, H.; Taylor, L.; Thomas, M.; Tuura, L.; Wildish, T.; Van Lingen, F. - In: IEEE TRANSACTIONS ON NUCLEAR SCIENCE. - ISSN 0018-9499. - 52:4 (2005), pp. 884-890. [10.1109/TNS.2005.852755]
Distributed Computing Grid Experiences in CMS
N. De Filippis
2005-01-01
Abstract
The CMS experiment is currently developing a computing system capable of serving, processing, and archiving the large number of events that will be generated when the CMS detector starts taking data. During 2004, CMS undertook a large-scale data challenge to demonstrate the ability of the CMS computing system to cope with a sustained data-taking rate equivalent to 25% of the startup rate. Its goals were: to run CMS event reconstruction at CERN for a sustained period at a 25 Hz input rate; to distribute the data to several regional centers; and to enable data access at those centers for analysis. Grid middleware was used to help complete all aspects of the challenge. To continue to provide scalable access to the data from anywhere in the world, CMS is developing a layer of software that uses Grid tools to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting their analysis jobs. This paper describes the data challenge experience with Grid infrastructure and the current development of the CMS analysis system.
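The "layer of software" the abstract mentions is described only at a high level here. As a purely illustrative sketch of the workflow it implies (a physicist names a dataset, the tool splits the request into independent Grid jobs and emits a job description for each), the following Python fragment shows the general shape of such a submission layer. All names in it (AnalysisTask, run_analysis.sh, the JDL-style fields) are hypothetical assumptions for illustration, not the actual CMS tool's interface.

```python
# Minimal sketch of a dataset -> job-splitting -> job-description workflow,
# loosely modeled on the analysis layer described in the abstract.
# Everything here is hypothetical: names, files, and the JDL template.

from dataclasses import dataclass


@dataclass
class AnalysisTask:
    dataset: str         # logical dataset name the physicist wants to analyse
    total_events: int    # events available in the dataset
    events_per_job: int  # granularity of the job splitting


def split_into_jobs(task: AnalysisTask):
    """Split the requested event range into independent job slices."""
    first = 0
    while first < task.total_events:
        last = min(first + task.events_per_job, task.total_events)
        yield first, last
        first = last


def make_jdl(task: AnalysisTask, first: int, last: int) -> str:
    """Render a JDL-like job description for one slice (hypothetical template)."""
    return (
        'Executable    = "run_analysis.sh";\n'
        f'Arguments     = "{task.dataset} {first} {last}";\n'
        'InputSandbox  = {"run_analysis.sh", "user_config.cfg"};\n'
        'OutputSandbox = {"analysis_output.root", "job.log"};\n'
    )


if __name__ == "__main__":
    task = AnalysisTask("DST_B0_JPsiKs", total_events=100_000, events_per_job=25_000)
    for i, (first, last) in enumerate(split_into_jobs(task)):
        print(f"--- job {i} ---")
        print(make_jdl(task, first, last))
```

In a real deployment, each generated description would be handed to the Grid middleware's submission service rather than printed; the point of the sketch is only the division of labor the abstract describes, with the physicist supplying a dataset name and the tool handling splitting and job preparation.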