ATLAS experiment

ATLAS is the largest general-purpose particle detector experiment at the Large Hadron Collider (LHC), a particle accelerator at CERN in Switzerland. The experiment is designed to take advantage of the unprecedented energy available at the LHC and observe phenomena that involve highly massive particles which were not observable using earlier lower-energy accelerators. ATLAS was one of the two LHC experiments involved in the discovery of the Higgs boson in July 2012. It was also designed to search for evidence of theories of particle physics beyond the Standard Model.

History
Particle accelerator growth
The first cyclotron, an early type of particle accelerator, was built by Ernest O. Lawrence in 1931, with a radius of just a few centimetres and a particle energy of 1 megaelectronvolt (MeV). Since then, accelerators have grown enormously in the quest to produce new particles of greater and greater mass. As accelerators have grown, so too has the list of known particles that they might be used to investigate.

ATLAS Collaboration
The ATLAS Collaboration, the international group of physicists belonging to different universities and research centres who built and run the detector, was formed in 1992 when the proposed EAGLE (Experiment for Accurate Gamma, Lepton and Energy Measurements) and ASCOT (Apparatus with Super Conducting Toroids) collaborations merged their efforts to build a single, general-purpose particle detector for a new particle accelerator, the Large Hadron Collider. At present, the ATLAS Collaboration involves 6,003 members, of whom 3,822 are physicists (last update: June 26, 2022), from 257 institutions in 42 countries.

The LHC circulated its first beams in September 2008, but data-taking was then interrupted for over a year due to an LHC magnet quench incident. On 23 November 2009, the first proton–proton collisions occurred at the LHC and were recorded by ATLAS, at a relatively low injection energy of 900 GeV in the centre of mass of the collision. Since then, the LHC energy has been increasing: 2.36 TeV at the end of 2009, 7 TeV for the whole of 2010 and 2011, then 8 TeV in 2012. The first data-taking period, performed between 2010 and 2012, is referred to as Run I. After a long shutdown (LS1) in 2013 and 2014, in 2015 ATLAS saw 13 TeV collisions. The second data-taking period, Run II, was completed at the end of 2018, again at 13 TeV, with a recorded integrated luminosity of nearly 140 fb−1 (inverse femtobarns). A second long shutdown (LS2) in 2019–2022, with upgrades to the ATLAS detector, was followed by Run III, which started in July 2022.

Leadership
The ATLAS Collaboration is currently led by Spokesperson Stephane Willocq (2025–present) and Deputy Spokespersons Anna Sfyrla and Guillaume Unal. Former Spokespersons include Peter Jenni (1995–2009), Fabiola Gianotti (2009–2013), David Charlton (2013–2017), Karl Jakobs (2017–2021) and Andreas Hoecker (2021–2025).
Experimental program
In the field of particle physics, ATLAS studies different types of processes detected or detectable in energetic collisions at the Large Hadron Collider (LHC). For processes already known, it is a matter of measuring the properties of known particles ever more accurately, or of finding quantitative confirmations of the Standard Model. Processes not observed so far would, if detected, allow the discovery of new particles or the confirmation of physical theories that go beyond the Standard Model.

Standard Model
The Standard Model of particle physics is the theory describing three of the four known fundamental forces (the electromagnetic, weak, and strong interactions, while omitting gravity) in the universe, as well as classifying all known elementary particles. It was developed in stages throughout the latter half of the 20th century, through the work of many scientists around the world, with the current formulation being finalized in the mid-1970s upon experimental confirmation of the existence of quarks. Since then, confirmations of the top quark (1995), the tau neutrino (2000), and the Higgs boson (2012) have added further credence to the Standard Model. In addition, the Standard Model has predicted various properties of weak neutral currents and the W and Z bosons with great accuracy. Although the Standard Model is believed to be theoretically self-consistent and has demonstrated huge successes in providing experimental predictions, it leaves some phenomena unexplained and falls short of being a complete theory of fundamental interactions. It does not fully explain baryon asymmetry, incorporate the full theory of gravitation as described by general relativity, or account for the accelerating expansion of the universe as possibly described by dark energy. The model does not contain any viable dark matter particle that possesses all of the required properties deduced from observational cosmology. It also does not incorporate neutrino oscillations and their non-zero masses.

Precision measurements
With the important exception of the Higgs boson, detected by the ATLAS and CMS experiments in 2012, all the particles predicted by the Standard Model had already been observed by previous experiments. The Higgs mechanism, which includes the Higgs boson, gives mass to elementary particles, leading to differences between the weak force and electromagnetism by giving the W and Z bosons mass while leaving the photon massless. On July 4, 2012, ATLAS, together with CMS, its sister experiment at the LHC, reported evidence for the existence of a particle consistent with the Higgs boson at a confidence level of 5 sigma, with a mass of about 125 GeV. In October 2013, two of the theoretical physicists who predicted the existence of the Standard Model Higgs boson, Peter Higgs and François Englert, were awarded the Nobel Prize in Physics.

Top quark properties
The properties of the top quark, discovered at Fermilab in 1995, had only been measured approximately. With much greater energy and higher collision rates, the LHC produces a tremendous number of top quarks, allowing ATLAS to make much more precise measurements of the top quark's mass and of its interactions with other particles. These measurements provide indirect information on the details of the Standard Model, with the possibility of revealing inconsistencies that point to new physics.

Beyond the Standard Model
While the Standard Model predicts that quarks, leptons, and neutrinos should exist, it does not explain why the masses of these particles differ by orders of magnitude.
Furthermore, according to the Standard Model, the mass of the neutrinos should be exactly zero, like that of the photon. Instead, neutrinos have mass: in 1998, results from the Super-Kamiokande detector determined that neutrinos can oscillate from one flavor to another, which requires that they have non-zero mass. For these and other reasons, many particle physicists believe it is possible that the Standard Model will break down at energies at the teraelectronvolt (TeV) scale or higher. Most alternative theories, including the Grand Unified Theories (GUTs) and Supersymmetry (SUSY), predict the existence of new particles with masses greater than those of Standard Model particles.

Supersymmetry
Most of the currently proposed theories predict new higher-mass particles, some of which may be light enough to be observed by ATLAS. Models of supersymmetry involve new, highly massive particles. In many cases these decay into high-energy quarks and stable heavy particles that are very unlikely to interact with ordinary matter. The stable particles would escape the detector, leaving as a signal one or more high-energy quark jets and a large amount of "missing" momentum; a sketch of how such a missing-momentum signature is computed follows at the end of this section. Other hypothetical massive particles, such as those in Kaluza–Klein theory, might leave a similar signature. The data collected up to the end of LHC Run II show no evidence of supersymmetric or otherwise unexpected particles; the search will continue in the data collected from Run III onwards.

CP violation
The asymmetry between the behavior of matter and antimatter, known as CP violation, is also being investigated.

Microscopic black holes
Some hypotheses, based on the ADD model, involve large extra dimensions and predict that micro black holes could be formed by the LHC. These would decay immediately by means of Hawking radiation, producing all particles in the Standard Model in equal numbers and leaving an unequivocal signature in the ATLAS detector.
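As referenced above, the missing-momentum signature common to several of these searches can be made concrete with a small numerical sketch: since the colliding partons carry negligible momentum transverse to the beam, the transverse momenta of all visible reconstructed objects must balance, and any imbalance is attributed to invisible particles. The following Python fragment is a minimal illustration of that bookkeeping; it is not ATLAS software, and the object list and all values in it are invented for the example.

```python
import math

# Each visible object is (pT in GeV, phi in radians), as produced by
# event reconstruction (jets, leptons, photons). Values are invented.
visible_objects = [
    (120.0, 0.3),   # leading jet
    (80.0, -2.6),   # sub-leading jet
    (45.0, 1.9),    # electron
]

# Missing transverse momentum: negative vector sum of the visible pT.
px = -sum(pt * math.cos(phi) for pt, phi in visible_objects)
py = -sum(pt * math.sin(phi) for pt, phi in visible_objects)

met = math.hypot(px, py)        # magnitude in GeV
met_phi = math.atan2(py, px)    # direction in the transverse plane

print(f"missing ET = {met:.1f} GeV at phi = {met_phi:.2f} rad")
```

In searches for supersymmetry, a large value of this missing transverse momentum, together with energetic jets, is the signature described above.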
ATLAS detector
The ATLAS detector is 46 metres long, 25 metres in diameter, and weighs about 7,000 tonnes; it contains some 3,000 km of cable.

Schematic of the main detector systems: the forward regions (end-caps) and barrel region; the magnet system (toroid magnets and solenoid magnet); the Inner Detector (Transition Radiation Tracker, Semi-Conductor Tracker, Pixel Detector); and the calorimeters (Liquid Argon Calorimeter and Tile Calorimeter).

Inner Detector
The Inner Detector begins a few centimetres from the proton beam axis, extends to a radius of 1.2 metres, and is 6.2 metres in length along the beam pipe. Its basic function is to track charged particles by detecting their interaction with material at discrete points, revealing detailed information about the types of particles and their momentum. The Inner Detector has three parts, which are explained below. The magnetic field surrounding the entire Inner Detector causes charged particles to curve; the direction of the curve reveals a particle's charge, and the degree of curvature reveals its momentum. The starting points of the tracks yield useful information for identifying particles; for example, if a group of tracks seems to originate from a point other than the original proton–proton collision, this may be a sign that the particles came from the decay of a hadron with a bottom quark (see b-tagging).

Pixel Detector
The Pixel Detector, the innermost part of the detector, contains four concentric layers and three disks on each end-cap, with a total of 1,744 modules, each measuring 2 centimetres by 6 centimetres. The detecting material is 250 μm thick silicon. Each module contains 16 readout chips and other electronic components. The smallest unit that can be read out is a pixel (50 by 400 micrometres); there are roughly 47,000 pixels per module. The minute pixel size is designed for extremely precise tracking very close to the interaction point. In total, the Pixel Detector has over 92 million readout channels, about 50% of the total readout channels of the whole detector. Such a large channel count posed a considerable design and engineering challenge. Another challenge was the radiation to which the Pixel Detector is exposed because of its proximity to the interaction point, requiring all components to be radiation-hardened in order to continue operating after significant exposures.

Semi-Conductor Tracker
The Semi-Conductor Tracker (SCT) is the middle component of the Inner Detector. It is similar in concept and function to the Pixel Detector, but with long, narrow strips rather than small pixels, making coverage of a larger area practical. Each strip measures 80 micrometres by 12 centimetres. The SCT is the most critical part of the Inner Detector for basic tracking in the plane perpendicular to the beam, since it measures particles over a much larger area than the Pixel Detector, with more sampled points and roughly equal (albeit one-dimensional) accuracy. It is composed of four double layers of silicon strips, and has 6.3 million readout channels and a total area of 61 square metres.

Transition Radiation Tracker
The Transition Radiation Tracker (TRT), the outermost component of the Inner Detector, is a combination of a straw tracker and a transition radiation detector. The detecting elements are drift tubes (straws), each four millimetres in diameter and up to 144 centimetres long. The uncertainty of track position measurements (position resolution) is about 200 micrometres.
This is not as precise as in the other two detectors, but it was necessary to reduce the cost of covering a larger volume and to provide transition radiation detection capability. Each straw is filled with gas that becomes ionized when a charged particle passes through. The straws are held at about −1,500 V, driving the ionization electrons to a fine wire down the centre of each straw and producing a current pulse (signal) in the wire. The wires with signals create a pattern of 'hit' straws that allows the path of the particle to be determined. Between the straws, materials with widely varying indices of refraction cause ultra-relativistic charged particles to produce transition radiation and leave much stronger signals in some straws. Xenon and argon gas is used to increase the number of straws with strong signals. Since the amount of transition radiation is greatest for highly relativistic particles (those with a speed very near the speed of light), and because particles of a particular energy have a higher speed the lighter they are, particle paths with many very strong signals can be identified as belonging to the lightest charged particles: electrons and their antiparticles, positrons. The TRT has about 298,000 straws in total.

Calorimeters
The calorimeters are situated outside the solenoidal magnet that surrounds the Inner Detector; their purpose is to measure the energy of particles by absorbing it. There are two basic calorimeter systems: an inner electromagnetic calorimeter and an outer hadronic calorimeter. Both are sampling calorimeters; that is, they absorb energy in high-density metal and periodically sample the shape of the resulting particle shower, inferring the energy of the original particle from this measurement.

Electromagnetic calorimeter
The electromagnetic (EM) calorimeter absorbs energy from particles that interact electromagnetically, which include charged particles and photons. It has high precision, both in the amount of energy absorbed and in the precise location of the energy deposited. The angle between the particle's trajectory and the detector's beam axis (or, more precisely, the pseudorapidity) and its angle within the perpendicular plane are both measured to within roughly 0.025 radians. The barrel EM calorimeter has accordion-shaped electrodes, and the energy-absorbing materials are lead and stainless steel, with liquid argon as the sampling material; a cryostat is required around the EM calorimeter to keep it sufficiently cool.

Hadron calorimeter
The hadron calorimeter absorbs energy from particles that pass through the EM calorimeter but do interact via the strong force; these particles are primarily hadrons. It is less precise, both in energy magnitude and in localization (within about 0.1 radians only).

Magnet system
The ATLAS detector uses two large superconducting magnet systems to bend the trajectories of charged particles so that their momenta can be measured. The inner solenoid produces a two tesla magnetic field surrounding the Inner Detector. This high magnetic field allows even very energetic particles to curve enough for their momentum to be determined, and its nearly uniform direction and strength allow measurements to be made very precisely. Particles with momenta below roughly 400 MeV will be curved so strongly that they will loop repeatedly in the field and most likely not be measured (see the sketch at the end of this section); however, this energy is very small compared to the several TeV of energy released in each proton collision.

Toroid Magnets
The outer toroidal magnetic field is produced by eight very large air-core superconducting barrel loops and two smaller end-cap air-core toroidal magnets, for a total of 24 coils, all situated outside the calorimeters and within the muon system.
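The looping of low-momentum tracks mentioned above follows from a standard formula: for a particle of unit charge, the radius of curvature in the transverse plane is approximately r [m] = pT [GeV] / (0.3 · B [T]). The short Python sketch below uses the two tesla solenoid field and 1.2 m Inner Detector radius quoted in this section; it is illustrative only, and the criterion that a track escapes once its maximum transverse reach 2r exceeds the tracker radius is a simplification.

```python
# Radius of curvature of a singly charged particle in a solenoid field:
#   r [m] ~ pT [GeV] / (0.3 * B [T])
B_FIELD_T = 2.0          # central solenoid field (see text)
TRACKER_RADIUS_M = 1.2   # outer radius of the Inner Detector (see text)

def curvature_radius_m(pt_gev: float) -> float:
    """Radius of the circular track in the transverse plane."""
    return pt_gev / (0.3 * B_FIELD_T)

for pt in (0.2, 0.4, 1.0, 10.0, 100.0):
    r = curvature_radius_m(pt)
    # A track starting at the beamline can reach at most a distance 2r
    # from it, so if 2r < tracker radius it loops inside the detector.
    reaches_edge = 2 * r >= TRACKER_RADIUS_M
    print(f"pT = {pt:6.1f} GeV -> r = {r:7.2f} m, "
          f"{'exits tracker' if reaches_edge else 'loops inside'}")
```

With these numbers the transition falls near pT = 0.36 GeV, consistent with the roughly 400 MeV figure given above.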
Forward detectors
The ATLAS detector is complemented by a set of sub-detectors in the forward region, measuring particles emitted at very small angles:
• LUCID (LUminosity Cherenkov Integrating Detector) is the first of these detectors, designed to measure luminosity; it is located in the ATLAS cavern, 17 m from the interaction point, between the two muon end-caps.
• ZDC (Zero Degree Calorimeter) is designed to measure neutral particles on-axis to the beam; it is located 140 m from the IP in the LHC tunnel, where the two beams are split back into separate beam pipes.
• AFP (ATLAS Forward Proton) is designed to tag diffractive events; it is located at 204 m and 217 m from the interaction point.
• ALFA (Absolute Luminosity For ATLAS) is designed to measure elastic proton scattering; it is located at 240 m, just before the bending magnets of the LHC arc.

Data systems

Data generation
Earlier particle detector read-out and event detection systems were based on parallel shared buses such as VMEbus or FASTBUS. Since such a bus architecture cannot keep up with the data requirements of the LHC detectors, all the ATLAS data acquisition systems rely on high-speed point-to-point links and switching networks. Even with advanced electronics for data reading and storage, the ATLAS detector generates too much raw data to read out or store everything: about 25 MB per raw event, multiplied by 40 million beam crossings per second (40 MHz) in the centre of the detector, produces a total of 1 petabyte of raw data per second. By not writing the empty segments of each event (zero suppression), which contain no physical information, the average size of an event is reduced to 1.6 MB, for a total of 64 terabytes of data per second.

Trigger system
The trigger system uses fast event reconstruction to identify, in real time, the most interesting events to retain for detailed analysis. In the second data-taking period of the LHC, Run II, there were two distinct trigger levels:
• The Level 1 trigger (L1), implemented in custom hardware at the detector site. The decision to save or reject event data is made in less than 2.5 μs. It uses reduced-granularity information from the calorimeters and the muon spectrometer, and reduces the rate of events in the read-out from 40 MHz to 100 kHz; the L1 rejection factor is therefore 400.
• The High-Level Trigger (HLT), implemented in software, uses a computer farm of approximately 40,000 CPUs. In order to decide which of the 100,000 events per second coming from L1 to save, specific analyses of each collision are carried out in 200 μs. The HLT uses limited regions of the detector, so-called Regions of Interest (RoI), reconstructed with the full detector granularity, including tracking, and allows energy deposits to be matched to tracks. The HLT rejection factor is 100: after this step, the rate of events is reduced from 100 kHz to 1 kHz. The remaining data, corresponding to about 1,000 events per second, are stored for further analysis (the combined rate arithmetic is worked through in the sketch after the Analysis process paragraph below).

Analysis process
ATLAS permanently records more than 10 petabytes of data per year. Offline event reconstruction is performed on all permanently stored events, turning the pattern of signals from the detector into physics objects such as jets, photons, and leptons. Grid computing is used extensively for event reconstruction, allowing the parallel use of university and laboratory computer networks throughout the world for the CPU-intensive task of reducing large quantities of raw data into a form suitable for physics analysis. The software for these tasks has been under development for many years, and refinements are ongoing, even after data collection has begun.
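The rate and volume figures in the Data generation and Trigger system descriptions above are mutually consistent, as the following Python sketch verifies. All input constants are the ones quoted in this section; the derived stored-data rate of about 1.6 GB/s is an illustrative calculation, not an official ATLAS figure.

```python
BUNCH_CROSSING_RATE_HZ = 40e6   # 40 MHz beam crossings
RAW_EVENT_SIZE_B = 25e6         # ~25 MB per raw event
ZS_EVENT_SIZE_B = 1.6e6         # ~1.6 MB after zero suppression
L1_OUTPUT_RATE_HZ = 100e3       # Level 1 trigger output rate
HLT_OUTPUT_RATE_HZ = 1e3        # High-Level Trigger output rate

raw_rate = BUNCH_CROSSING_RATE_HZ * RAW_EVENT_SIZE_B   # bytes per second
zs_rate = BUNCH_CROSSING_RATE_HZ * ZS_EVENT_SIZE_B     # bytes per second
print(f"raw data rate:        {raw_rate / 1e15:.0f} PB/s")   # 1 PB/s
print(f"zero-suppressed rate: {zs_rate / 1e12:.0f} TB/s")    # 64 TB/s

# Rejection factors of the two trigger levels.
print(f"L1 rejection factor:  {BUNCH_CROSSING_RATE_HZ / L1_OUTPUT_RATE_HZ:.0f}")
print(f"HLT rejection factor: {L1_OUTPUT_RATE_HZ / HLT_OUTPUT_RATE_HZ:.0f}")

# Data rate actually written to permanent storage after both triggers.
stored = HLT_OUTPUT_RATE_HZ * ZS_EVENT_SIZE_B
print(f"stored:               {stored / 1e9:.1f} GB/s")      # 1.6 GB/s
```

At roughly 1.6 GB/s during data-taking, a year of running at the LHC's duty cycle plausibly accumulates the more than 10 petabytes per year quoted above.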
Individuals and groups within the collaboration are continuously writing their own code to perform further analyses of these objects, searching the patterns of detected particles for particular physical models or hypothetical particles. This activity requires processing 25 petabytes of data every week.