Niko Neufeld – LHCb experiment CERN


Niko Neufeld at ISC 2018

Speaking Slot: 27 June 2018
13:20 to 13:40 at booth N-210

CERN’s Reply to the Computing and Storage Challenges related to the Large Hadron Collider beauty (LHCb) Experiment’s Upgrade Program: Prefabricated Datacenter Modules by Automation – Where performance meets efficiency

CERN is the renowned European Organization for Nuclear Research. It employs just over 2500 people. The laboratory’s scientific and technical staff design and build the particle accelerators and ensure their smooth operation. They also help to prepare, run, analyse and interpret data from complex scientific experiments.

From 2021, the ALICE O2 facility will be a high-throughput computing system comprising heterogeneous computing platforms and disk storage systems.
The upgraded LHCb experiment, scheduled to start taking data in 2021, will collect data at a higher luminosity than under the present running conditions and with a significantly increased selection efficiency.

The datacenters therefore need to be extended to provide this extra capacity, at limited cost and with a short delivery time.

The ALICE and LHCb experiments at CERN have decided not to build new or additional (traditional) datacenters, but to use Prefabricated Datacenter Modules to carry out the upgrades instead.

Why: The advantages of Automation’s Prefabricated Datacenter Modules are fully in line with the experiments’ requirements:

  • Fast, flexible and efficient solution to achieve the highest quality standards
  • Supports the high performance requirements of CERN’s HPC
  • Factory build lead-time of 12 weeks
  • Energy-efficient performance configurations, optimized for power and density
  • ‘Near source’: close to the experiment

Moreover, Automation is fully aligned with the ‘CERN openlab White Paper on Future ICT Challenges in Scientific Research’ of September 2017 – page 5, R&D Topic 1: Data-Centre Technologies and Infrastructures. The same goes for the ISC keynote speech given on 25 June by Dr. Maria Girone, CERN openlab Chief Technology Officer, ‘Tackling tomorrow’s computing challenges today at CERN’.

R&D Topic 1: Data-Centre Technologies and Infrastructures

Designing and operating distributed data infrastructures and computing centres poses challenges in areas such as networking, architecture, storage, databases, and cloud. These challenges are amplified and added to when operating at the extremely large scales required by major scientific endeavours.

CERN is evaluating different models for increasing computing and data-storage capacity, in order to accommodate the growing needs of the LHC experiments over the next decade. All models present different technological challenges. In addition to increasing the capacity of the systems used for traditional types of data processing and storage, explorations are being carried out into a number of alternative architectures and specialized capabilities. These will add heterogeneity and flexibility to the data centres, and should enable advances in resource optimization.


Source: ‘CERN openlab White Paper on Future ICT Challenges in Scientific Research’, September 2017 – page 5.