Open Access
Evaluation of the computing resources required for a Nordic research exploitation of the LHC
Author(s) -
Christina Zacharatou Jarlskog,
Sverker Almehed,
C. Driouichi,
P. Eerola,
U. Mjörnmark,
O. Smirnova,
T. P. A. Åkesson
Publication year - 2001
Publication title -
Proceedings of International Europhysics Conference on High Energy Physics — PoS(HEP2001)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.22323/1.007.0277
Subject(s) - large hadron collider , computer science , yard , operations research , particle physics , engineering , physics , quantum mechanics
A simulation study to evaluate the computing resources required for a research exploitation of the Large Hadron Collider (LHC) has been performed. The evaluation was done as a case study, assuming the existence of a Nordic regional centre and using the requirements for performing a specific physics analysis as a yardstick. Other input parameters were: an assumed distribution of researchers over the participating institutions, an analysis model, and two different functional structures of the computing resources.

1. A case study for the LHC data processing model

The Large Hadron Collider (LHC) is the world's biggest accelerator, being built at the European Particle Physics Laboratory, CERN. By the time of its completion in 2006, it will be capable of accelerating and colliding beams of protons at a centre-of-mass energy of 14 TeV. The LHC experiments expect to record raw data at a rate of about 1 PetaByte per year at the beginning of LHC operation [1]. The geographical spread of the collaborations increases the complexity of data access and analysis.

One of the important measurements at the LHC will be that of the parameter sin(2β), where β is an angle of the CKM unitarity triangle describing quark mixing. It is a central parameter for demonstrating CP violation in the B-meson system [2]. In order to measure this parameter, one needs to reconstruct and tag the decays Bd → J/ψKS. In addition to the signal events, one also needs to analyse control samples (J/ψK∗ and J/ψK+ events). The ATLAS detector [1] will trigger on such events during the low-luminosity running of the LHC. The high-level trigger [3], the event filter, consists of full event processing with off-line-type software.

∗Speaker. †On leave from JINR, 141980 Dubna, Russia.
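For context, the standard form of the observable behind this measurement (not spelled out in the text, and stated here up to sign conventions) is the time-dependent CP asymmetry of flavour-tagged Bd → J/ψKS decays:

```latex
% Time-dependent CP asymmetry for B_d -> J/psi K_S (standard textbook form,
% neglecting CP violation in mixing and in the decay amplitude):
A_{CP}(t)
  = \frac{\Gamma\!\left(\bar B_d^0(t) \to J/\psi K_S\right)
          - \Gamma\!\left(B_d^0(t) \to J/\psi K_S\right)}
         {\Gamma\!\left(\bar B_d^0(t) \to J/\psi K_S\right)
          + \Gamma\!\left(B_d^0(t) \to J/\psi K_S\right)}
  = \sin 2\beta \, \sin(\Delta m_d \, t)
```

Extracting sin(2β) thus requires knowing the production flavour of each B meson, which is why the tagged control samples (J/ψK∗ and J/ψK+) mentioned above are needed alongside the signal events.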
The final event-filter output rate of J/ψ decays to μ+μ− or e+e− was estimated to yield, in one day of data taking, a total of approximately 3.1·10^6 di-lepton events [1]. In the following, it is assumed that these events will be written out in a separate raw data stream.

In order to cope with the LHC data analysis and storage requirements, a tiered hierarchy of distributed regional centres was proposed by the MONARC project (Models of Networked Analysis at Regional Centres for LHC Experiments) [4]. In this scheme, the main centre is CERN (Tier0), where the data reconstruction is expected to take place. The Tier1 centres have the next-largest capacity after CERN. Among the possible activities of the Tier1s, the production and reconstruction of fully simulated data requires significant resources. Data analysis and fast simulation will mainly be the responsibility of the smaller Tier2 centres. Computer farms at institutions, and individual workstations, constitute the lower tiers.

Data recorded directly from the online stream, including the signals from the detector elements and the on-line reconstruction results (called "raw data" or RAW), are expected to reside at CERN, stored in a mass storage system. During processing at the Tier0 centre, the raw data will first be run through a reconstruction program, which calculates charged-particle trajectories and energy depositions in the calorimeters. The reconstruction results are called ESD (Event Summary Data). Further processing algorithms are used at the Tier0 to prepare AOD (Analysis Object Data), which contain reduced information from the ESD, and "tag data" or TAG, a small set of variables describing each event. The information in the TAG data set is meant to be used for an initial selection of the AOD data to be analysed. The expected size of these data types per event is 1 MB for RAW, 0.1 MB for ESD, 0.01 MB for AOD and 0.001 MB for TAG.
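As a back-of-the-envelope check, the daily data volume of each data type follows directly from the quoted ~3.1·10^6 di-lepton events per day and the per-event sizes above (a sketch; the variable and function names are ours, not from the paper):

```python
# Daily data volumes for the separate di-lepton stream, using the
# per-event sizes quoted in the text and ~3.1e6 events per day.

EVENTS_PER_DAY = 3.1e6  # estimated event-filter output, di-lepton events/day

SIZE_MB = {       # expected size per event, in MB
    "RAW": 1.0,
    "ESD": 0.1,
    "AOD": 0.01,
    "TAG": 0.001,
}

def daily_volume_gb(data_type: str) -> float:
    """Daily volume of one data type in GB (taking 1 GB = 1000 MB)."""
    return EVENTS_PER_DAY * SIZE_MB[data_type] / 1000.0

for dt in SIZE_MB:
    print(f"{dt}: {daily_volume_gb(dt):8.1f} GB/day")
```

For the quoted numbers this puts the RAW stream alone at roughly 3 TB per day, illustrating why only the reduced AOD and TAG data are intended for routine analysis at the lower tiers.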
The MONARC project developed a simulation tool [5] to model various configurations of regional centres. It allows one to determine the optimal resources and strategies needed to achieve the highest efficiency of the tasks performed by users. In this paper, a MONARC simulation study to evaluate the computing resources required in the Nordic countries for a research exploitation of the LHC has been performed, using the measurement of sin(2β) as one of several physics cases. The simulations addressed the processing and analysis required for one day of data taking, which includes (a) reconstruction of RAW data (production of ESD, AOD and TAG data) at Tier0 (CERN), (b) analysis of AOD data at a Nordic Tier2 (or Tier1), (c) fast simulation of AOD data at a Nordic Tier2 (or Tier1), and (d) full simulation and reconstruction of RAW data at a Nordic Tier1. The amount of data was assumed to correspond to one day of data taking, i.e. about 3 million real data events and 6 million simulated events.

It was assumed that this specific analysis will be conducted by four experimental groups in the Nordic countries: at the Niels Bohr Institute (NBI) in Copenhagen, at the University of Oslo, at the University of Bergen and at the University of Lund. Each experimental group was assumed to consist of five researchers. All the CPU power was placed at NBI, which was considered both in a Tier1 and in a Tier2 configuration; the other three institutes represent users of the computing power of NBI. When NBI was considered to be a Tier2 centre, the full simulation was assumed to be performed at three Tier1 centres, producing two million events each. It was
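The workload assumptions of this case study can be summarised in a small model (all numbers are taken from the text; the names and the helper function are ours and purely illustrative):

```python
# Sketch of the simulated one-day workload described in the text:
# 3 million real events and 6 million fully simulated events, analysed
# by four Nordic groups of five researchers each, with all CPU power
# placed at NBI.

REAL_EVENTS = 3_000_000       # one day of recorded di-lepton data
SIMULATED_EVENTS = 6_000_000  # fully simulated events needed per day
GROUPS = ["NBI", "Oslo", "Bergen", "Lund"]
RESEARCHERS_PER_GROUP = 5

def tier1_simulation_share(n_tier1_centres: int = 3) -> int:
    """Events each external Tier1 produces when NBI acts only as a Tier2."""
    assert SIMULATED_EVENTS % n_tier1_centres == 0
    return SIMULATED_EVENTS // n_tier1_centres

total_users = len(GROUPS) * RESEARCHERS_PER_GROUP
print(f"{total_users} researchers; "
      f"{tier1_simulation_share()} simulated events per external Tier1")
```

In the Tier2 scenario the 6 million simulated events split evenly into 2 million per external Tier1, matching the figure stated above; in the Tier1 scenario the full simulation load would instead fall on NBI itself.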
