The CMS collaboration is undertaking a major effort to define its analysis model and to develop the software tools needed to analyse many millions of simulated and real data events, accessed by a large number of people at many geographically distributed sites. From the computing point of view, one of the most complex issues in remote analysis is data discovery and access. Software tools were developed to move data, make them available to the full international community and validate them for subsequent analysis. Batch analysis processing is performed with purpose-built workload management tools, which are mainly responsible for job preparation and job submission. Job monitoring and output management are implemented as the last part of the analysis chain. Grid tools provided by the LCG project are evaluated to give access to the data and the resources through a user-friendly interface for the physicists submitting the analysis jobs. This work presents an overview of the current implementation of the CMS analysis system and of the interactions between its components.

The CMS analysis chain in a distributed environment / Barrass, T.; Bonacorsi, D.; Ciraolo, G.; Corvo, M.; DE FILIPPIS, Nicola; Donvito, G.; Faina, L.; Fanfani, A.; Fanzago, F.; Grandi, C.; Innocente, V.; Lacaprara, S.; Maggi, Giorgio Pietro; Maggi, M.; Pierro, A.; Silvestris, L.; Spiga, D.; Taylor, L.; Tuura, L.; Wildish, T.. - In: NUCLEAR INSTRUMENTS & METHODS IN PHYSICS RESEARCH. SECTION A, ACCELERATORS, SPECTROMETERS, DETECTORS AND ASSOCIATED EQUIPMENT. - ISSN 0168-9002. - 559:1(2006), pp. 38-42. [10.1016/j.nima.2005.11.124]

The CMS analysis chain in a distributed environment

DE FILIPPIS, Nicola; MAGGI, Giorgio Pietro
2006-01-01

Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11589/2023
Citations
  • Scopus: 2
  • Web of Science (ISI): 0