The CMS collaboration is undertaking a major effort to define its analysis model and to develop software tools capable of analysing several million simulated and real data events, used by a large number of people at many geographically distributed sites. From the computing point of view, one of the most complex issues in remote analysis is data discovery and access. Software tools were developed to move data, make them available to the full international community, and validate them for subsequent analysis. Batch analysis processing is performed with purpose-built workload management tools, which are mainly responsible for job preparation and job submission. Job monitoring and output management are implemented as the last part of the analysis chain. Grid tools provided by the LCG project are being evaluated as a means of accessing data and resources, providing a user-friendly interface to the physicists submitting analysis jobs. This work presents an overview of the current implementation and of the interactions between these components of the CMS analysis system.
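The chain outlined above (data discovery over validated datasets, job preparation by event-range splitting, and submission to the Grid) can be sketched schematically. This is a minimal illustrative sketch only: all function names, the catalog layout, and the splitting strategy are assumptions for exposition, not the actual CMS tool interfaces.

```python
# Illustrative sketch of the analysis-chain stages described above.
# Names and data structures are hypothetical, not actual CMS software APIs.

def discover_datasets(catalog, selection):
    """Data discovery: find published, validated datasets matching a selection.

    `catalog` is assumed to map dataset names to metadata dictionaries.
    """
    return [name for name, meta in catalog.items()
            if meta.get("validated") and selection in name]

def prepare_jobs(dataset, events_per_job, total_events):
    """Job preparation: split a dataset into per-job event ranges."""
    jobs = []
    start = 0
    while start < total_events:
        end = min(start + events_per_job, total_events)
        jobs.append({"dataset": dataset,
                     "first_event": start,
                     "last_event": end - 1})
        start = end
    return jobs

def submit(jobs):
    """Job submission: in the real system this step would hand the jobs
    to a Grid (LCG) workload management service."""
    return [{"job": j, "status": "submitted"} for j in jobs]
```

The split-then-submit pattern mirrors the abstract's division of responsibilities: the workload manager owns preparation and submission, while monitoring and output retrieval follow as separate, later stages.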
|Title:||The CMS analysis chain in a distributed environment|
|Publication date:||2006|
|Digital Object Identifier (DOI):||10.1016/j.nima.2005.11.124|
|Appears in collections:||1.1 Journal article|