The Global Community of Particle Physics
This account brings you hot items from public particle physics news sources, including CERN, SymmetryMagazine.org, and Interactions.org.
Quantum-mechanics test with an LNGS germanium detector among Science's 10 favourite news stories
2021-01-22T10:27:39Z via NavierStokesApp To: Public
The results obtained by a joint theoretical and experimental effort between researchers of the Centro Ricerche Enrico Fermi, the Istituto Nazionale di Fisica Nucleare and the University of Trieste, using an ultra-pure germanium detector located at the STELLA underground facility of INFN's Laboratori Nazionali del Gran Sasso, published on 7 September 2020 in Nature, have been included by the journal Science among the 10 most important science news stories of 2020.
Neutrino astronomy and glaciology meet at IceCube’s Polar Science Workshop
2021-01-21T23:27:42Z via NavierStokesApp To: Public
Yesterday, the Second IceCube Polar Science Workshop wrapped up after two days of talks and discussion. With 135 registered participants and 25 talks, the interdisciplinary workshop brought together scientists and engineers from around the world to discuss the interplay between glaciology and neutrino astronomy.
Counting research rodents, a possible cause for irritable bowel syndrome, and spitting cobras
2021-01-21T19:27:43Z via NavierStokesApp To: Public
Online News Editor David Grimm joins host Sarah Crespi to discuss a controversial new paper that estimates how many rodents are used in research in the United States each year. Though there is no official number, the paper suggests there might be more than 100 million rats and mice housed in research facilities in the country—doubling or even tripling some earlier estimates. Next, Staff Writer Jennifer Couzin-Frankel talks with Sarah about a new theory behind the cause of irritable bowel syndrome—that it might be a localized allergic reaction in the gut. Sarah also chats with Taline Kazandjian, a postdoctoral research associate at the Centre for Snakebite Research & Interventions in Liverpool, U.K., about how the venom from spitting cobras has evolved to cause maximum pain and why these snakes might have developed the same defense mechanism three different times. This week’s episode was produced with help from Podigy.
Week 1 at the Pole
2021-01-21T16:27:40Z via NavierStokesApp To: Public
Martinus Justinus Godefriedus Veltman (1931 – 2021)
2021-01-21T09:28:16Z via NavierStokesApp To: Public
Martinus (“Tini”) Veltman started his scientific career relatively late: he obtained his PhD from the University of Utrecht in 1963 (1) under the supervision of Leon Van Hove, but had already moved to CERN in 1961, where Van Hove had been named Leader of the CERN Theory Division. At CERN, Van Hove studied mainly hadronic physics, but Veltman became interested in weak interactions (2) and current algebras. It is in these fields that he made his most important and lasting contributions.
Around 1966, he was trying to understand the deeper origin of the conservation, or near conservation, of the weak currents. In particular, he tried to throw some light on the general confusion that prevailed at the time concerning the so-called “Schwinger terms” in the commutators of two current components. While on a visit from CERN to Brookhaven, he wrote a paper in which he suggested a set of divergence equations which generalised the notion of the covariant derivative of quantum electrodynamics. This fundamental idea was taken up the following year and developed further by John Stewart Bell. At that time, people had postulated the existence of a pair of charged, massive vector W± bosons as intermediaries of the weak interactions so, motivated by these divergence equations, Veltman decided to study their field-theory properties. The electrodynamics of such charged bosons had been formulated already by T.D. Lee and C.N. Yang in 1962. They had shown that electromagnetic gauge invariance allows expression of the vector boson’s charge e, magnetic moment µ and quadrupole moment Q in terms of only two parameters, e and κ, as µ = e(1 + κ)/2mW and Q = −eκ/mW². The resulting theory is highly divergent, but Veltman noticed that many divergences cancel for the value κ = 1. This is the value predicted by a theory in which W± and the photon form a Yang-Mills triplet. For Veltman, this was a clear signal that the theory of weak and electromagnetic interactions must obey a Yang-Mills gauge invariance.
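The Lee–Yang parametrisation above can be evaluated directly. The following is a minimal sketch in natural units, with an illustrative unit charge e = 1 and only an approximate W mass; the numbers are for illustration, not precision work:

```python
def w_boson_moments(e, m_w, kappa):
    """Magnetic moment mu and electric quadrupole moment Q of a charged
    vector boson in the Lee-Yang parametrisation (natural units)."""
    mu = e * (1 + kappa) / (2 * m_w)  # mu = e(1 + kappa) / 2 m_W
    q = -e * kappa / m_w**2           # Q = -e kappa / m_W^2
    return mu, q

# At the Yang-Mills value kappa = 1, mu reduces to e / m_W:
mu, q = w_boson_moments(e=1.0, m_w=80.4, kappa=1.0)
```

At κ = 1 the magnetic moment takes the value e/mW characteristic of a gauge triplet, which is the cancellation Veltman noticed.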
The study of a massive Yang-Mills field theory turned out to be very complicated, both conceptually, since the correct Feynman rules were not known, and practically, because the number of terms grew very fast. Veltman had to develop a computer program to handle them. He called it “Schoonschip” (“Clean ship” in Dutch) and it was the first program of symbolic manipulation applied to theoretical high-energy physics. Schoonschip opened the way to the modern computer codes used to manipulate Feynman diagrams, which are responsible for the enormous progress made with the sophisticated calculations of Standard Model processes produced in recent decades.
The experience Veltman had acquired in his thesis, working with diagrams in which the particles in the intermediate lines were on their mass shells, the so-called “cutting rules”, was precious. He spent 1968 at Orsay near Paris, where he lectured on Yang-Mills and path integrals and, in 1969, he was joined in Utrecht by Gerard ’t Hooft, a graduate student with whom he shared the 1999 Nobel Prize. Their work was a real “tour de force”. They invented and developed many techniques that became standards in particle physics. The citation of the Nobel Prize reads “… for elucidating the quantum structure of electroweak interactions in physics”. The importance of this work cannot be overestimated. Although the citation refers to the electroweak interactions, their result made possible the subsequent discovery of QCD. Since that time, gauge theories have become the universal language of fundamental physics.
Veltman and ’t Hooft gave the first detailed presentation of their results at a small meeting in Orsay in 1971. This meeting was remarkable in many respects. Firstly, it offered the first complete picture of the renormalisation properties of Yang-Mills theories. Secondly, it triggered stimulating discussions among the participants, in particular regarding the vital importance of the axial current anomaly cancellation.
With the rise of the Standard Model, a long series of meetings was launched, which became known as “triangular meetings” (Paris-Rome-Utrecht). Subsequently extended to other European centres, the triangular meetings played an important role in the development of new ideas in field theory and in establishing a European network in theoretical physics. Veltman was a central figure in those meetings.
After the discovery of the Intermediate Vector Bosons, several groups embarked on a systematic study of the higher-order electroweak corrections to the predictions of the Standard Model. The group led by Veltman was among the most active. A particular focus of attention was the so-called ρ parameter, ρ = MW²/(MZ² cos²θW). Veltman observed that its deviation from unity, the value predicted to lowest order in the Glashow-Weinberg-Salam theory, depends quadratically on the top quark mass and logarithmically on the Brout-Englert-Higgs boson mass. Precise determinations of the ρ parameter led eventually to a prediction of the top quark mass, confirmed by the top quark discovery by CDF and DZero at Fermilab. Even more precise values of MW and Mtop led to significant limitations on the BEH boson mass, in agreement with the mass of the scalar particle discovered by ATLAS and CMS in 2012.
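To make the ρ parameter concrete, here is a back-of-the-envelope evaluation. The mass and mixing-angle values below are approximate, illustrative inputs (not a precision electroweak fit); the point is only that the measured ρ sits slightly above the tree-level value of 1:

```python
# rho = M_W^2 / (M_Z^2 * cos^2(theta_W)); tree level predicts rho = 1,
# and loop corrections shift it quadratically in the top-quark mass.
m_w = 80.38              # W boson mass in GeV (approximate)
m_z = 91.19              # Z boson mass in GeV (approximate)
sin2_theta_eff = 0.2315  # effective weak mixing angle (approximate)

rho = m_w**2 / (m_z**2 * (1.0 - sin2_theta_eff))
deviation = rho - 1.0    # the loop-induced shift, around the percent level
```

It was precisely this percent-level deviation, measured ever more accurately, that allowed the top quark mass to be predicted before its discovery.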
Veltman stayed in Utrecht until 1981. He attracted many talented young students and established a very active school of theoretical high-energy physics in the Netherlands. He was a lifelong supporter of CERN and an elected member of the CERN Scientific Policy Committee from 1976 to 1982. In recent years, we often saw him at the SPC annual meetings to which former members are invited, and enjoyed the acute and humorous remarks he used to include in his interventions.
John Iliopoulos and Luciano Maiani
(1) His first publication is entitled “Unitarity and causality in a renormalizable field theory with unstable particles”, Physica, Vol. 29, 186 (1963).
(2) He even joined Bernardini’s neutrino experiment for a while.
Xcitement down under: Australia gets first X-band facility
2021-01-15T11:29:24Z via NavierStokesApp To: Public
On 16 September 2020, a container filled with pallets, boxes and electronic racks left CERN’s Meyrin site to embark on a two-month sea journey to the other side of the world. On 17 November, at precisely 3.12 p.m. local time, its ship docked at Port Melbourne, from where, following customs clearance earlier this year, the container and its contents were transported to a new home: the University of Melbourne.
The container held the components of the southern hemisphere’s first X-band radio-frequency test facility; “X-band” refers to the ultra-high frequency range in which the device operates. The device, half of the CERN facility known as XBOX-3, will soon be part of the “X-Lab” at the University of Melbourne. Its journey resulted from an agreement signed between CERN and the Australian Collaboration for Accelerator Science in 2010.
XBOX-3 and its two predecessors were built at CERN in the context of the Compact Linear Collider (CLIC) study, which envisions building a linear electron–positron collider with a collision energy of 380 GeV. They were built to develop the technology to accelerate particles to high energies over a relatively short distance. Such accelerators are described as having a high accelerating gradient. In addition to aiding the development of the next generation of particle accelerators, high-gradient acceleration technology is also useful for medical applications, such as radiotherapy, and in synchrotron light sources.
In 2015, CERN decided that half of XBOX-3 would eventually be sent to Australia to help its nascent accelerator community. “Having the only X-band facility this side of the equator is a huge boost to the growing accelerator-physics community in Australia. It will allow us to train specialists, do novel research and create exciting industry-engagement opportunities based on the many applications of accelerators,” says Suzie Sheehy, group leader of the Accelerator Physics Group at the University of Melbourne. “The Melbourne X-Lab team, which includes senior researchers, PhD students and support staff, is grateful for CERN’s contribution to our project.”
The device will be renamed MelBOX, in light of its new home, and is expected to come online this year.
An elegy for Arecibo, and how our environments change our behavior
2021-01-14T19:29:22Z via NavierStokesApp To: Public
Science Senior Correspondent Daniel Clery regales host Sarah Crespi with tales about the most important work to come from 57 years of research at the now-defunct Arecibo Observatory and plans for the future of the site. Sarah also talks with Toman Barsbai, an associate professor in the school of economics at the University of Bristol, about the influence of ecology on human behavior—can we figure out how many of our behaviors are related to the different environments where we live? Barsbai and colleagues took on this question by comparing behaviors around finding food, reproduction, and social hierarchy in three groups of animals living in the same places: foraging humans, nonhuman mammals, and birds. This week’s episode was produced with help from Podigy.
ATLAS releases ‘full orchestra’ of analysis instruments
2021-01-14T18:29:14Z via NavierStokesApp To: Public
The ATLAS collaboration has begun to publish likelihood functions, information that will allow researchers to better understand and use their experiment’s data in future analyses.
Meyrin, Switzerland, sits serenely near the Swiss-French border, surrounded by green fields and the beautiful Rhône river. But a hundred meters beneath the surface, protons traveling at nearly the speed of light collide and create spectacular displays of subatomic fireworks inside the experimental detectors of the Large Hadron Collider at CERN, the European particle physics laboratory.
One detector, called ATLAS, is five stories tall and has the largest volume of any particle detector in the world. It captures the trajectory of particles from collisions that happen a billion times a second and measures their energy and momentum. Those collisions produce incredible amounts of data for researchers to scour, searching for evidence of new physics. For decades, scientists at ATLAS have been optimizing ways to archive their analysis of that data so these rich datasets can be reused and reinterpreted.
Twenty years ago, during a panel discussion at CERN’s First Workshop on Confidence Limits, participants unanimously agreed to start publishing likelihood functions with their experimental results. These functions are essential to particle physics research because they encode all the information physicists need to statistically analyze their data through the lens of a particular hypothesis. This includes allowing them to distinguish signal (interesting events that may be clues to new physics) from background (everything else) and to quantify the significance of a result.
As it turns out, though, getting a room full of particle physicists to agree to publish this information was the easiest part.
In fact, it was not until 2020 that ATLAS researchers actually started publishing likelihood functions along with their experimental results. These “open likelihoods” are freely available on the open-access site HEPData as part of a push to make LHC results more transparent and available to the wider community.
“One of my goals in physics is to try and make it more accessible,” says Giordon Stark, a postdoctoral researcher at the University of California, Santa Cruz, who is on the development team for the open-source software used to publish the likelihood functions.
The US Department of Energy's Office of Science and the National Science Foundation support US involvement in the ATLAS experiment.
Stark says releasing the full likelihoods is a good step toward his goal.
The problem with randomness
Why are likelihoods so essential? Because particle collision experiments are inherently random. Unlike in a deterministic experiment, where a researcher does “x” and expects “y” to happen, in a random experiment (like throwing dice or colliding beams of protons), a researcher can do “x” the same way every time but can only predict the random outcome probabilistically.
Because of the inherent randomness of particle interactions in the ATLAS detector, physicists need to construct what is called a “probability model” to mathematically describe the experiment and form meaningful conclusions about how the resulting data relate to a theory.
The probability model is a mathematical representation of all the possible outcomes. It’s represented by the expression p(x|θ): the probability “p” of obtaining data “x,” given the parameters “θ.”
The data are observations from the ATLAS detector, while the parameters are everything influencing the system, from the laws of physics to the calibration constants of the detector. A few of these parameters are central to a physicist’s model (they’re called “parameters of interest”—things like the mass of the Higgs boson), but hundreds of other “nuisance parameters” (things like detector responses, calibration constants and the behavior of the particles themselves) also need to be taken into account.
When experimentally observed data are plugged into the probability model, they return a likelihood function, which determines the values of the model’s parameters that best describe the observed data.
Importantly, the process answers the question of how likely it would be for a physicist’s theory to have produced the data they observe.
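The logic above can be sketched with a toy Poisson counting experiment. The signal and background counts below are invented for illustration; the point is that the likelihood scores how well each hypothesis accounts for the same observed data:

```python
import math

def poisson_log_likelihood(n_obs, s, b):
    """Log-likelihood of observing n_obs events given expected
    signal s and background b counts: ln P(n_obs | s + b)."""
    lam = s + b
    return n_obs * math.log(lam) - lam - math.lgamma(n_obs + 1)

# Compare two hypotheses against the same observed count:
n = 12
ll_bkg_only = poisson_log_likelihood(n, s=0.0, b=8.0)      # background only
ll_sig_plus_bkg = poisson_log_likelihood(n, s=4.0, b=8.0)  # signal + background
```

Here the signal-plus-background hypothesis yields the higher log-likelihood, because its expected count matches the observation better; real ATLAS likelihoods follow the same principle with hundreds of parameters instead of two.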
A new tool comes to the rescue
When you consider the hundreds of parameters in an ATLAS analysis, each with their respective uncertainties, along with the layers of functions relating the parameters to each other, calculating the likelihoods gets pretty complicated—and so does presenting them. While likelihoods for one or two parameters can be plotted on a graph, this clearly isn’t possible when there are hundreds of them—making the question of how to publish the likelihoods much more challenging than whether this should be done.
In 2011, ATLAS researchers Kyle Cranmer, Wouter Verkerke and their team released two tools to help with this. One, called the RooFit Workspace, allowed researchers to save their likelihoods in a digital file. The other, called HistFactory, made it easier for users to construct a likelihood function for their theory. Since then, the HistFactory concept has evolved into an open-source software package, spearheaded by Stark and fellow physicists Matthew Feickert and Lukas Heinrich, called pyhf [pronounced in three syllables: py h f].
Cranmer says it’s important to understand that pyhf isn’t some magical black box where you put data in and get a likelihood out. Researchers need to make lots of decisions in the process, and “every little bit of that likelihood function should be tied to part of the justification that you have for it and the story that you’re telling as a scientist,” he says.
After interpreting these decisions, pyhf exports the probability model in a plain-text, easy-to-read format called JSON that can be read across a range of platforms, making it easier for other researchers to access the likelihood function and see how the analysis was done.
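To illustrate what such a plain-text likelihood looks like, here is a toy single-channel workspace written in the style of pyhf's JSON format. The field names follow the published workspace schema, but the channel name and all numbers are invented for illustration:

```python
import json

# A toy single-channel probability model in the style of pyhf's JSON
# workspace format (illustrative values, two bins per sample).
workspace = {
    "channels": [{
        "name": "signal_region",
        "samples": [
            {"name": "signal",
             "data": [5.0, 10.0],
             "modifiers": [{"name": "mu", "type": "normfactor", "data": None}]},
            {"name": "background",
             "data": [50.0, 60.0],
             "modifiers": [{"name": "bkg_uncert", "type": "shapesys",
                            "data": [5.0, 12.0]}]},
        ],
    }],
    "observations": [{"name": "signal_region", "data": [53.0, 65.0]}],
    "measurements": [{"name": "fit", "config": {"poi": "mu", "parameters": []}}],
    "version": "1.0.0",
}

text = json.dumps(workspace, indent=2)  # human-readable, platform-neutral
```

Even in this toy, the structure mirrors how a physicist thinks about the analysis: samples, their expected yields, the modifiers encoding uncertainties, and the observed data all sit side by side in one readable file.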
“The important part is that it’s readable,” says Cranmer. “The stuff you’re reading is not some random, technical gobbledygook. It’s tied to how a physicist thinks about the analysis.”
Making old data do new work
Before the RooFit Workspace came along, the thousands of researchers involved in the ATLAS collaboration had no standardized way to format and store data likelihood functions. Much of the meticulous data analysis was done by PhD students who eventually graduated and left for new positions, taking their intimate familiarity with likelihood construction along with them.
Without the full likelihood function, it’s impossible to reproduce or reinterpret ATLAS data from published results without having to make possibly crude approximations. But with the layers of rich metadata embedded in the pyhf likelihoods, including background estimates, systematic uncertainty and observed data counts from the detector, scientists have everything they need to mathematically reconstruct the analysis. This allows them to reproduce and reinterpret previously published results without repeating the time-consuming and expensive process of analyzing the data from scratch.
Public likelihoods also provide fantastic opportunities for reinterpretation by theorists, says Sabine Kraml, a theoretical physicist at the Laboratory of Subatomic Physics and Cosmology in Grenoble, France, who has been involved with helping establish how LHC data, including the likelihoods, should be presented.
With full likelihood functions, theorists can calculate how well their theories fit the data collected by the detector “at a completely different level of reliability and precision," says Kraml.
To understand just how much more sophisticated and complex the analysis becomes, she says, consider the difference between a simple song and a full orchestral symphony.
Although this precise model-fitting is limited to theories that share the same statistical model as the one originally tested by the experiment—“It’s a restricted playground,” Cranmer says—there is a work-around. Full likelihoods can be put through an additional round of processing called recasting, using a service Cranmer proposed called RECAST, which generates a new likelihood function in the context of a physicist’s theory. Armed with this new likelihood, scientists can test their theories against existing ATLAS data, searching for new physics in old datasets.
So far, two ATLAS searches have been repurposed using RECAST. One used a dark-matter search to study a Higgs boson decaying to bottom quarks. The other used a search for displaced hadronic jets to look at three new physics models.
Cranmer says he hopes the ATLAS experimental community will continue to publish their likelihoods and take advantage of RECAST so the wider scientific community can test more and more theories.
Dark Energy Survey makes public catalog of nearly 700 million astronomical objects
2021-01-14T16:28:55Z via NavierStokesApp To: Public
Editor’s note: The DES second data release will be featured at the meeting of the American Astronomical Society. The session “NOIRLab’s Data Services: A Practical Demo Built on Science with DES DR2” takes place on Thursday, Jan. 14, 3:10-4:40 p.m. Central time. The session “Dark Energy Survey: New Results and Public Data Release 2” takes place on Friday, Jan. 15, 11 a.m.-12:30 p.m. Central time.
The Dark Energy Survey, a global collaboration including the Department of Energy’s Fermi National Accelerator Laboratory, the National Center for Supercomputing Applications, and the National Science Foundation’s NOIRLab, has released DR2, the second data release in the survey’s seven-year history. DR2 is the topic of sessions today and tomorrow at the 237th Meeting of the American Astronomical Society, which is being held virtually.
The second data release from the Dark Energy Survey, or DES, is the culmination of over a half-decade of astronomical data collection and analysis with the ultimate goal of understanding the accelerating expansion of the universe and the phenomenon of dark energy, which is thought to be responsible for this accelerated expansion. It is one of the largest astronomical catalogs released to date.
Including a catalog of nearly 700 million astronomical objects, DR2 builds on the 400 million objects cataloged with the survey’s prior data release, or DR1, and also improves on it by refining calibration techniques, which, with the deeper combined images of DR2, lead to improved estimates of the amount and distribution of matter in the universe.
Astronomical researchers around the world can access these unprecedented data and mine them to make new discoveries about the universe, complementary to the studies being carried out by the Dark Energy Survey collaboration. The full data release is online and available to the public to explore and gain their own insights as well.
Shown here is the elliptical galaxy NGC 474 with star shells. Elliptical galaxies are characterized by their relatively smooth appearance as compared with spiral galaxies, one of which is to the left of NGC 474, which is oriented with South to the top and West to the left. The colorful neighboring spiral (NGC 470) has characteristic flocculent structure interwoven with dust lanes and spiral arms. NGC 474 is at a distance of about 31 megaparsecs (100 million light-years) from the sun in the constellation of Pisces. The region surrounding NGC 474 shows unusual structures characterized as ‘tidal tails’ or ‘shells of stars’ made up of hundreds of millions of stars. These features are likely due to recent (within the last billion years) mergers of smaller galaxies into the main body of NGC 474 or close passages of nearby galaxies, such as the NGC 470 spiral. For coordinate information, visit the NOIRLab webpage for this photo. Photo: DES/NOIRLab/NSF/AURA. Acknowledgments: Image processing: DES, Jen Miller (Gemini Observatory/NSF’s NOIRLab), Travis Rector (University of Alaska Anchorage), Mahdi Zamani & Davide de Martin. Image curation: Erin Sheldon, Brookhaven National Laboratory
DES was designed to map hundreds of millions of galaxies and to discover thousands of supernovae in order to measure the history of cosmic expansion and the growth of large-scale structure in the universe, both of which reflect the nature and amount of dark energy in the universe. DES has produced the largest and most accurate dark matter map from galaxy weak lensing to date, as well as a new map, three times larger, that will be released in the near future.
One early result relates to the construction of a catalog of a type of pulsating star known as “RR Lyrae,” which tells scientists about the region of outer space beyond the edge of our Milky Way. In this area nearly devoid of stars, the motion of the RR Lyrae hints at the presence of an enormous “halo” of invisible dark matter, which may provide clues on how our galaxy was assembled over the last 12 billion years. In another result, DES scientists used the extensive DR2 galaxy catalog, along with data from the LIGO experiment, to estimate the location of a black hole merger and, independent of other techniques, infer the value of the Hubble constant, a key cosmological parameter. Combining their data with other surveys, DES scientists have also been able to generate a complete map of the Milky Way’s dwarf satellites, giving researchers insight into how our own galaxy was assembled and how it compares with cosmologists’ predictions.
Covering 5,000 square degrees of the southern sky (one-eighth of the entire sky) and spanning billions of light-years, the survey data enables many other investigations in addition to those targeting dark energy, covering a vast range of cosmic distances — from discovering new nearby solar system objects to investigating the nature of the first star-forming galaxies in the early universe.
“This is a momentous milestone. For six years, the Dark Energy Survey collaboration took pictures of distant celestial objects in the night sky. Now, after carefully checking the quality and calibration of the images captured by the Dark Energy Camera, we are releasing this second batch of data to the public,” said DES Director Rich Kron of Fermilab and the University of Chicago. “We invite professional and amateur scientists alike to dig into what we consider a rich mine of gems waiting to be discovered.”
This irregular dwarf galaxy, named IC 1613, contains some 100 million stars (bluish in this portrayal). It is a member of our Local Group of galaxy neighbors, a collection which also includes our Milky Way, the Andromeda spiral and the Magellanic clouds. 2.4 million light-years away, it contains several examples of Cepheid variable stars — key calibrators of the cosmic distance ladder. The bulk of its stars were formed about 7 billion years ago, and it does not appear to be undergoing star formation at the present day, unlike other very active dwarf irregulars such as the Large and Small Magellanic clouds. To the lower right of IC 1613 (oriented with North to the left and East down in this view), one may view a background galaxy cluster (several hundred times more distant than IC 1613) consisting of dozens of orange-yellow blobs, centered on a pair of giant cluster elliptical galaxies. To the left of the irregular galaxy is a bright, sixth magnitude, foreground Milky Way star in the constellation of Cetus the Whale, identified here as a star by its sharp diffraction spikes radiating at 45 degree angles. For coordinate information, visit the NOIRLab webpage for this photo. Photo: DES/NOIRLab/NSF/AURA. Acknowledgments: Image processing: DES, Jen Miller (Gemini Observatory/NSF’s NOIRLab), Travis Rector (University of Alaska Anchorage), Mahdi Zamani & Davide de Martin
The primary tool in collecting these images, the DOE-built Dark Energy Camera, is mounted to the NSF-funded Víctor M. Blanco 4-meter Telescope, part of the Cerro Tololo Inter-American Observatory in the Chilean Andes, part of NSF’s NOIRLab. Each week, the survey collected thousands of pictures of the southern sky, unlocking a trove of potential cosmological insights.
Once captured, these images (and the large amount of data surrounding them) are transferred to the National Center for Supercomputing Applications for processing via the DES Data Management project. Using the Blue Waters supercomputer at NCSA, the Illinois Campus Cluster and computing systems at Fermilab, NCSA prepares calibrated data products for public and research consumption. It takes approximately four months to process one year’s worth of data into a searchable, usable catalog.
The detailed precision cosmology constraints based on the full six-year DES data set will come out over the next two years.
The DES DR2 is hosted at the Community Science and Data Center, a program of NOIRLab. CSDC provides software systems, user services and development initiatives to connect and support the scientific missions of NOIRLab’s telescopes, including the Blanco Telescope at Cerro Tololo Inter-American Observatory.
NCSA, NOIRLab and the LIneA Science Server collectively provide the tools and interfaces that enable access to DR2.
The Dark Energy Survey uses a 570-megapixel camera mounted on the Blanco Telescope, at the Cerro Tololo Inter-American Observatory in Chile, to image 5,000 square degrees of southern sky. Photo: Fermilab
This work is supported in part by the U.S. Department of Energy Office of Science.
The Dark Energy Survey is a collaboration of more than 400 scientists from 26 institutions in seven countries. Funding for the DES Projects has been provided by the U.S. Department of Energy, the U.S. National Science Foundation, the Ministry of Science and Education of Spain, the Science and Technology Facilities Council of the United Kingdom, the Higher Education Funding Council for England, the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, the Kavli Institute of Cosmological Physics at the University of Chicago, Funding Authority for Studies and Projects in Brazil, Carlos Chagas Filho Foundation for Research Support of the State of Rio de Janeiro, Brazilian National Council for Scientific and Technological Development and the Ministry of Science, Technology and Innovation, the German Research Foundation and the collaborating institutions in the Dark Energy Survey, the list of which can be found at www.darkenergysurvey.org/collaboration.
About NSF’s NOIRLab
NSF’s NOIRLab (National Optical-Infrared Astronomy Research Laboratory), the US center for ground-based optical-infrared astronomy, operates the international Gemini Observatory (a facility of NSF, NRC–Canada, ANID–Chile, MCTIC–Brazil, MINCyT–Argentina and KASI–Republic of Korea), Kitt Peak National Observatory (KPNO), Cerro Tololo Inter-American Observatory (CTIO), the Community Science and Data Center (CSDC), and Vera C. Rubin Observatory. It is managed by the Association of Universities for Research in Astronomy (AURA) under a cooperative agreement with NSF and is headquartered in Tucson, Arizona. The astronomical community is honored to have the opportunity to conduct astronomical research on Iolkam Du’ag (Kitt Peak) in Arizona, on Maunakea in Hawaiʻi, and on Cerro Tololo and Cerro Pachón in Chile. We recognize and acknowledge the very significant cultural role and reverence that these sites have to the Tohono O’odham Nation, to the Native Hawaiian community, and to the local communities in Chile, respectively.
NCSA at the University of Illinois at Urbana-Champaign provides supercomputing and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students, and collaborators from around the globe use advanced digital resources to address research grand challenges for the benefit of science and society. NCSA has been advancing one third of the Fortune 50® for more than 30 years by bringing industry, researchers, and students together to solve grand challenges at rapid speed and scale. For more information, please visit www.ncsa.illinois.edu.
Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance LLC, a joint partnership between the University of Chicago and the Universities Research Association, Inc. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @Fermilab.
The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
Fermi National Accelerator Laboratory
Fermilab is America's particle physics and accelerator laboratory. Founded in 1967, Fermilab drives discovery by investigating the smallest building blocks of matter using world-leading particle accelerator and detector facilities. We also use the universe as a laboratory, making measurements of the cosmos to probe the mysteries of dark matter and dark energy. Fermilab is located near Chicago, Illinois, and is managed by Fermi Research Alliance, LLC for the U.S. Department of Energy Office of Science.
What are we made of? How did the universe begin? What secrets do the smallest, most elemental particles of matter hold, and how can they help us understand the intricacies of space and time?
Since 1967, Fermilab has worked to answer these and other fundamental questions and enhance our understanding of everything we see around us. As the United States' premier particle physics laboratory, we do science that matters. We work together with our international partners on the world's most advanced particle accelerators and dig down to the smallest building blocks of matter. We also probe the farthest reaches of the universe, seeking out the nature of dark matter and dark energy.
Fermilab's 6,800-acre site is located in Batavia, Illinois, and is managed by the Fermi Research Alliance LLC for the U.S. Department of Energy Office of Science. FRA is a partnership of the University of Chicago and Universities Research Association Inc., a consortium of 89 research universities.
P.O. Box 500
+1 630 840 3000
+1 630 840 4343 (fax)
www.fnal.gov
A new way to look for gravitational waves
2021-01-14T16:28:55Z via NavierStokesApp To: Public
"A new way to look for gravitational waves"
In a paper published today in Physical Review Letters, Valerie Domcke of CERN and Camilo Garcia-Cely of DESY report on a new technique to search for gravitational waves – the ripples in the fabric of spacetime that were first detected by the LIGO and Virgo collaborations in 2015 and earned Rainer Weiss, Barry Barish and Kip Thorne the Nobel Prize in Physics in 2017.
Domcke and Garcia-Cely’s technique is based on the conversion of gravitational waves of high frequency (ranging from megahertz to gigahertz) into radio waves. This conversion takes place in the presence of magnetic fields and distorts the cosmic microwave background, the relic radiation from the early universe that permeates the cosmos.
The research duo shows that this distortion, deduced from cosmic microwave background data obtained with radio telescopes, can be used to search for high-frequency gravitational waves generated by cosmic sources such as sources from the dark ages or even further back in our cosmic history. The dark ages are the period between the time when hydrogen atoms formed and the moment when the first stars lit up the cosmos.
“The odds that these high-frequency gravitational waves convert into radio waves are tiny, but we counterbalance these odds by using an enormous detector, the cosmos,” explains Domcke. “The cosmic microwave background provides an upper bound on the amplitude of the high-frequency gravitational waves that convert into radio waves. These high-frequency waves are beyond the reach of the laser interferometers LIGO, Virgo and KAGRA.”
Domcke and Garcia-Cely derived two such upper bounds, using cosmic microwave background measurements from two radio telescopes: the balloon-borne ARCADE 2 instrument and the EDGES telescope located at the Murchison Radio-Astronomy Observatory in Western Australia. The researchers found that, for the weakest possible cosmic magnetic fields, determined from current astronomical data, the EDGES measurements result in a maximum amplitude of one part in 10¹² for a gravitational wave with a frequency of around 78 MHz, whereas the ARCADE 2 measurements yield a maximum amplitude of one part in 10¹⁴ at a frequency of 3−30 GHz. For the strongest possible cosmic magnetic fields, these bounds are tighter (one part in 10²¹ for EDGES and one part in 10²⁴ for ARCADE 2) and are about seven orders of magnitude more stringent than current bounds derived from existing laboratory-based experiments.
Domcke and Garcia-Cely say that data from next-generation radio telescopes such as the Square Kilometre Array, as well as improved data analysis, should tighten these bounds further and could perhaps even detect gravitational waves from the dark ages and earlier cosmic times.
Building a Giant 2D Map of the Universe to Prepare for the Largest 3D Map
2021-01-13T22:28:26Z via NavierStokesApp To: Public
"Building a Giant 2D Map of the Universe to Prepare for the Largest 3D Map"
Nearly 200 researchers pitched in to gather, process, and stitch together images for half of the sky to prepare for the start of the Dark Energy Spectroscopic Instrument’s observations
This video describes the monumental effort that went into constructing a 2D map of the universe to prepare for the Dark Energy Spectroscopic Instrument, which will produce the largest-ever 3D map of the universe. The final data release for the preparation of this 2D map, known as Data Release 9 or DR9, is scheduled to be distributed Jan. 13. (Credit: Marilyn Sargent/Lawrence Berkeley National Laboratory)
Before DESI, the Dark Energy Spectroscopic Instrument, can begin its 5-year mission from an Arizona mountaintop to produce the largest 3D sky map yet, researchers first needed an even bigger 2D map of the universe.
The 2D map, pieced together from 200,000 telescope images and several years of satellite data, lacks information about galaxy distances. DESI will supply this, along with other useful details, by measuring the color signatures and redshifts of the galaxies and quasars in its survey. The redder an object's light appears, the farther it is from Earth and the faster it is moving away from us; this stretching of light toward redder wavelengths is known as redshift.
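The relationship between redshift, recession speed, and distance can be sketched with the low-redshift form of Hubble's law. This is an illustration only, not the DESI pipeline; the Hubble constant value and the helper names here are assumptions for demonstration:

```python
# Illustrative sketch of the low-redshift Hubble's law, not DESI code.
C_KM_S = 299_792.458  # speed of light, km/s
H0 = 70.0             # Hubble constant, km/s per megaparsec (assumed round value)

def recession_velocity(z: float) -> float:
    """Recession velocity in km/s, valid for small redshift z (v ~ c*z)."""
    return C_KM_S * z

def distance_mpc(z: float) -> float:
    """Approximate distance in megaparsecs from v = H0 * d."""
    return recession_velocity(z) / H0

# A galaxy whose spectral lines are shifted by z = 0.05 recedes at
# roughly 15,000 km/s and lies a bit over 200 megaparsecs away.
v = recession_velocity(0.05)
d = distance_mpc(0.05)
```

For the distant objects DESI targets, cosmologists use the full relativistic distance-redshift relation rather than this linear approximation.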
In the end, this 2D map of the universe is the largest ever, based on the area of sky covered, its depth in imaging faint objects, and its more than 1 billion galaxy images.
The ambitious, 6-year effort to capture images and stitch them together for this 2D map involved 1,405 observing nights at three telescopes on two continents, years of data from a space satellite, an upgraded camera to image incredibly faint and distant galaxies, and 150 observers and 50 other researchers from around the world. The effort also required 1 petabyte of data (enough to store 1 million movies) and 100 million CPU hours at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC).
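The storage comparison above can be checked with one line of arithmetic; the roughly 1 gigabyte per movie that falls out is an inference from the article's round numbers, not a stated specification:

```python
# Check the comparison: 1 petabyte equated with storage for 1 million movies.
PETABYTE_GB = 1_000_000              # 1 PB expressed in gigabytes (decimal units)
movies = 1_000_000
gb_per_movie = PETABYTE_GB / movies  # 1.0 GB per movie, a plausible compressed file size
```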
2D map sets the stage for DESI observations, with a goal to solve dark energy mystery
“This is the biggest map by almost any measure,” said David Schlegel, co-project scientist for DESI who led the imaging project, known as the DESI Legacy Imaging Surveys. Schlegel is a cosmologist at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), which is the lead institution for the international DESI collaboration.
The map covers half of the sky, and digitally sprawls over 10 trillion pixels, which is equivalent to a mosaic of 833,000 high-res smartphone photos. The DESI collaboration has about 600 participating scientists from 54 institutions around the world.
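The smartphone comparison works out as follows, assuming a typical 12-megapixel camera (the per-photo megapixel count is our assumption; the article gives only the totals):

```python
# Check: 10 trillion pixels expressed as a mosaic of 12-megapixel photos.
total_pixels = 10_000_000_000_000    # 10 trillion pixels in the map
photo_pixels = 12_000_000            # one 12 MP smartphone photo (assumed spec)
photos = total_pixels // photo_pixels
# photos == 833_333, consistent with the article's "833,000 high-res photos"
```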
Publicly viewable at legacysurvey.org/viewer, the Sky Viewer map includes 2 billion objects – more than half of which are galaxies – and numerous clickable filters to select from specific object types or surveys. Some of the objects are individually labeled, and viewers can choose to display constellations, for example, and galaxies and quasars that will be imaged by DESI. Quasars are among the brightest objects in the universe, with supermassive black holes at their center that emit powerful jets of matter.
DESI is equipped with an array of 5,000 swiveling, automated robots, each toting a thin fiber-optic cable that will be pointed at individual objects. These cables will gather the light from 35 million galaxies and 2.4 million quasars during the five years of DESI observations.
DESI will collect and transmit data from these measurements to Berkeley Lab’s NERSC from Kitt Peak. Researchers at NERSC have already prepared for this incoming data by identifying which data-processing tasks would take up the most computing time and improving the code to speed up these tasks on the center’s current- and next-generation supercomputers. “In the end, we increased processing throughput five to seven times, which was a big accomplishment – bigger than I expected,” said Laurie Stephey, a data analytics engineer at NERSC who played a key role in the effort.
The primary purpose of compiling the 2D map data is to identify these galaxy and quasar targets for DESI, which will measure their light to pinpoint their redshift and distance. This will ultimately provide new details about mysterious dark energy that is driving the universe’s accelerating expansion.
Nathalie Palanque-Delabrouille, DESI co-spokesperson and a cosmologist at the French Alternative Energies and Atomic Energy Commission (CEA), noted that the expansion rate has evolved, and there are many unanswered questions about the changes in this rate.
“Our universe had a surprising history,” she explained. “During the first half of its life, its expansion was driven mostly by the dark matter it contains.” Dark matter is unknown matter, making up 85 percent of all matter in the universe and so far only observed indirectly through its gravitational effects on normal matter.
“However, in the past 7 billion years the expansion of our universe has been gradually accelerating under the influence of a mysterious dark energy,” she added, “and the goal of DESI is to precisely clarify this overall picture by unveiling what dark energy is.”
Palanque-Delabrouille has been involved in the effort to pick targets for DESI to observe from the surveys’ data. She noted that DESI will gather light from a mix of galaxies at several distances, including bright galaxies that are within 4 billion light-years of Earth, so-called red galaxies that allow us to see back to 8 billion years ago, very young blue or “emission-line” galaxies that reach even further back, to 10 billion years ago, and ultimately quasars, which are so bright they can be seen up to 12 billion light-years away.
“Having managed to collect and process these imaging data is really a major achievement. DESI wouldn’t be getting anywhere without such large imaging surveys,” she said.
Software guides observing plan, and standardizes and stitches imaging data
Piecing together all of the DESI surveys’ images to create a seamless sky map was no trivial task, Schlegel explained. “One of the goals is to get a really uniform image by stitching together multiple observations,” he said. “We started out scattershot. And cameras do have gaps – they miss stuff. Part of the challenge here was planning the observing program so that you could fill in all of the gaps – that was a huge logistical challenge. You have to make sure it is as homogeneous as possible.”
The three surveys that comprise the DESI Legacy Imaging Surveys conducted imaging in three different colors, and each survey took three separate images of the same sky areas to ensure complete coverage. This new, ground-based imaging data was also supplemented by imaging data from NASA’s Wide-field Infrared Survey Explorer (WISE) satellite mission, which collected space images in four bands of infrared light.
For the Legacy Imaging Surveys’ data-taking effort, Schlegel designed a code, improved over time, that helped to calculate the best approach and timing for capturing the best images to completely cover half of the sky, considering hours of darkness, weather, exposure time, planetary and satellite paths, and moon brightness and location, among other variables.
Dustin Lang, DESI imaging scientist at the Perimeter Institute for Theoretical Physics in Canada, played a key role in standardizing all of the imaging data from ground- and sky-based surveys and stitching it together.
In some images, Lang noted, “the sky might be really stable and calm,” while on another night “we might have light clouds or just a turbulent atmosphere that causes blurring in the images.” His challenge: to develop software that recognized the good data without diluting it with the bad data. “What we wanted to think about is what the stars and galaxies looked like above the atmosphere,” he said, and to make sure the images matched up even when they were taken under different conditions.
Lang created “The Tractor,” a so-called “inference-based” model of the sky, to compare with data for the shape and brightness of objects imaged by different surveys and to select the best fit. The Tractor drew heavily upon supercomputer resources at Berkeley Lab’s NERSC to process the Legacy Imaging Surveys’ data and ensure its quality and consistency.
It was Lang, too, who recognized the potential popularity of the viewing tool created for the imaging data – which was adapted from street-mapping software – and brought it to the public as the Sky Viewer interactive map.
The viewing tool, he noted, was originally used by DESI researchers to check data discrepancies in the surveys’ images. It “transformed the way our team interacted with the data. It suddenly felt a lot more real that we could just scroll around the sky and explore individual problems with our data. It turned out to be surprisingly powerful.”
Imaging data from 3 surveys seeds other science research
Arjun Dey, the DESI project scientist for the National Science Foundation’s NOIRLab, which includes the Kitt Peak National Observatory site where DESI is situated, was a major contributor to two of the three imaging surveys, serving as the lead scientist for the Mayall z-band Legacy Survey (MzLS) carried out at Kitt Peak, and as co-lead scientist with Schlegel for the Dark Energy Camera Legacy Survey (DECaLS) carried out at a NOIRLab site in Chile.
The third DESI-preparatory survey, known as the Beijing-Arizona Sky Survey or (BASS), was conducted at Kitt Peak and supported by an international collaboration including the Chinese Academy of Sciences and the University of Arizona.
Researchers from China made more than 90 trips to Kitt Peak to carry out observations for BASS. “A joint research team of more than 40 people from 11 institutes in China and the U.S. participated in BASS and contributed to the success of this data release,” said Hu Zou, an astrophysicist at the Key Laboratory of Optical Astronomy in Beijing and a co-lead investigator for BASS. “This team will also play an important role in the future of the DESI survey and related sciences,” he added.
The MzLS survey, meanwhile, featured a rebuilt camera designed to see the infrared light emitted by distant, faint galaxies. Equipped with four large, ultrasensitive light-capturing sensors, called CCDs, the MzLS survey camera produced images of galaxies 10 times fainter than those sampled in a previous survey. DESI itself is outfitted with very similar CCDs that enable it to capture light from objects up to 12 billion light-years away, and both sets of CCDs were developed at Berkeley Lab.
The collective effort of the three surveys, Dey said, “was one of the most uniform, deep surveys of the sky that has ever been undertaken. It was really exciting to participate.”
All of the raw data from the imaging surveys has been released to the scientific community and public. This final data release, known as Data Release 9 or DR9, has been preceded by eight other data releases. The data have already spawned several disparate research projects, including citizen science efforts that utilize the wisdom of crowds.
Dey, along with Schlegel, is a part of a research effort that uses a machine-learning algorithm to automatically identify light-bending phenomena known as gravitational lenses in the DESI surveys data, for example.
Aaron Meisner, a NOIRLab researcher and DESI participant, is also involved in the lensing study and in a citizen science project called Backyard Worlds: Planet 9, which calls for the general public’s help in finding a possible ninth planet in our solar system by studying space images. Participants have already found numerous new brown dwarfs, dim star-like objects too small to sustain hydrogen fusion.
“The imaging data provides a deep resource that is essential to carry out DESI’s unique mission while giving the scientific community access to an extraordinary dataset,” said DESI Director Michael Levi, a senior scientist at Berkeley Lab. “We look forward to using these imaging data to yield new clues and reveal the secrets of our expanding universe.”
NERSC is a DOE Office of Science User Facility.
This research is supported by the Director, Office of Science, Office of High Energy Physics of the U.S. Department of Energy, by the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility, by the U.S. National Science Foundation, Division of Astronomical Sciences, and by the Chinese Academy of Sciences.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.
Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 14 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.
The U.S. National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 to promote the progress of science. NSF supports basic research and people to create knowledge that transforms the future.
The NSF’s National Optical-Infrared Astronomy Research Laboratory (NOIRLab) is the U.S. center for ground-based optical-infrared astronomy, operating multiple research facilities including Kitt Peak National Observatory (KPNO) and Cerro Tololo Inter-American Observatory (CTIO). The Laboratory is operated by the Association of Universities for Research in Astronomy (AURA) under a cooperative agreement with NSF’s Division of Astronomical Sciences.
The Chinese Academy of Sciences (中国科学院) is the national academy for the natural sciences of the People’s Republic of China. It has historical origins in the Academia Sinica during the Republican era and was formerly also known by that name. CAS is the world’s largest research organization, comprising around 60,000 researchers working in 114 institutes, and has been consistently ranked among the top research organizations around the world.
Established in 1958 and aiming at the forefront of astronomical science, the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC) conducts cutting-edge astronomical studies, operates major national facilities and develops state-of-the-art technological innovations.
The Mayall Telescope at Kitt Peak National Observatory and the Blanco Telescope at Cerro Tololo Inter-American Observatory are operated by the Association of Universities for Research in Astronomy (AURA) under cooperative agreement with the National Science Foundation. The Bok Telescope is located on Kitt Peak and operated by Steward Observatory, University of Arizona. The authors are honored to be permitted to conduct astronomical research on Iolkam Du’ag (Kitt Peak), a mountain with particular significance to the Tohono O’odham.
This research used resources of the National Energy Research Scientific Computing Center, which is supported by the U.S. Department of Energy Office of Science under Contract No. DE-AC02-05CH11231.
For more details, please see https://www.legacysurvey.org/acknowledgment/.
Lawrence Berkeley National Laboratory
In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.
Berkeley Lab is a multidisciplinary national laboratory located in Berkeley, California on a hillside directly above the campus of the University of California at Berkeley. The site consists of 76 buildings located on 183 acres, which overlook both the campus and the San Francisco Bay.
1 Cyclotron Road
IceCube Collaboration awarded 2021 Rossi Prize
2021-01-13T21:27:54Z via NavierStokesApp To: Public
The status of supersymmetry
2021-01-12T15:29:08Z via NavierStokesApp To: Public
"The status of supersymmetry"
Once the most popular framework for physics beyond the Standard Model, supersymmetry is facing a reckoning—but many researchers are not giving up on it yet.
The Standard Model of particle physics is both fantastically successful and glaringly incomplete.
Its predictions have pieced together many of the known features of the universe and guided physicists to new discoveries, such as the Higgs boson. But it cannot account for the existence of dark matter—the mysterious substance that makes up 85% of the universe’s matter—or explain the Higgs boson’s mass.
How can scientists fill the gaps? For decades, a set of theories collectively known as supersymmetry seemed to provide an elegant solution.
Supersymmetry more than doubles the number of particles in the Standard Model. The particles we currently know to exist can be divided into two categories: fermions and bosons. In supersymmetric theories, each particle has an as-yet undiscovered “superpartner” with many similar properties. Fermions are paired with bosons and vice-versa.
The idea of a symmetry between fermions and bosons originated in the early 1970s to address a mathematical issue with string theory. In 1974, Julius Wess and Bruno Zumino discovered that a broad class of quantum field theories could be made supersymmetric through a generalization of the symmetries of relativity. Soon researchers devised theories in which a particle and its superpartner could have different masses.
In the early 1980s, theorists realized that the Standard Model itself could be made supersymmetric and that this extension would resolve some vexing problems with the theory. For example, the small mass of the Higgs boson is notoriously difficult to explain—its calculation requires subtracting two very large numbers that just happen to be slightly different from each other. “But if you add supersymmetry, this takes care of all these cancellations such that you can get a light Higgs mass without needing to have such luck,” says Elodie Resseguie, a postdoc at the US Department of Energy’s Lawrence Berkeley National Laboratory.
Along with an explanation for the Higgs mass, supersymmetry offered other theoretical advantages. The lightest supersymmetric particle would be a promising dark matter candidate. And the strengths of the electromagnetic force, the weak force and the strong force would become equal at extremely high energies, suggesting that the fundamental forces we observe today were unified in the early universe. “It really is a very beautiful theory,” says Michael Peskin, a theorist at SLAC National Accelerator Laboratory.
The simplest supersymmetric theories—those that best explain the Higgs boson—predict a zoo of new particles with masses comparable to those of the W and Z bosons. Those were within reach of the Large Hadron Collider, so when it turned on in 2009, many particle physicists thought the discovery of superpartners was imminent. But after the triumphant discovery of the Higgs boson came … no more new fundamental particles.
“I was shocked when supersymmetric particles were not discovered in the early days of the LHC,” Peskin says.
Not all theorists were caught by surprise. “There were many people who were loudly saying that there was something wrong with the basic picture of supersymmetry well before the LHC,” says Nima Arkani-Hamed, a theorist at the Institute for Advanced Study in Princeton, New Jersey. “You would have thought that if all these particles were lying around not much heavier than where we’ve been, they would leave some indirect effects in low-energy physical processes.”
Previous experiments at the Large Electron-Positron Collider, which operated from 1989 to 2000, had already cast doubt on the simplest supersymmetric models, Arkani-Hamed says.
Jim Gates, a theorist at Brown University and president-elect of the American Physical Society, says he never expected supersymmetry to show up at the LHC. For decades, he has maintained that the most plausible supersymmetric theories predict superpartners too heavy to be discovered with current accelerators.
As data from the LHC has continued to accumulate, the models of supersymmetry originally favored by the community have been largely ruled out. For example, gluinos, the hypothesized superpartners of gluons, have been excluded up to a mass of 2 trillion electronvolts—an order of magnitude larger than many theorists had hoped. It appears increasingly unlikely that supersymmetry could include all three features—an explanation for the Higgs mass, a dark matter particle and force unification—found in the pre-LHC models.
The lack of evidence for supersymmetry at the LHC does not signify a death knell for the idea. Nevertheless, “now the community is going off in a large number of different directions,” Peskin says. “We’re all pretty confused right now.”
If supersymmetry does exist, there are two main possibilities: Either all the supersymmetric particles are too heavy to be produced at the energies accessible to current particle accelerators, as Gates suspects—or superpartners are created in collisions at the LHC, but for some reason they escape detection.
In the second case, “people are looking for new models which produce exotic signatures we haven’t looked for in the past, or they’re looking for models where the signatures are experimentally more challenging, which is a reason we haven’t been able to set as strong limits in the past,” says Resseguie, a member of the ATLAS collaboration.
For instance, most searches for new particles at the LHC have assumed that they would decay almost immediately after being created, giving them no time to travel away from the interaction point. But several unconventional supersymmetric theories predict long-lived superpartners. These particles would travel between a few micrometers and a few hundred thousand kilometers before decaying. Ongoing projects on both the ATLAS and CMS experiments seek to pick up the trail of long-lived particles. US participation in ATLAS and CMS receives support from the US Department of Energy’s Office of Science and the National Science Foundation.
Researchers on ATLAS and CMS are also looking for superpartners that would decay into low-energy Standard Model particles. “It’s a very challenging search because we can’t use the standard techniques that we use for most of the other supersymmetry searches,” says Christian Herwig, a postdoc at Fermi National Accelerator Laboratory who works on the CMS experiment.
Even without supersymmetry, low-energy particles are abundant in the detectors, so researchers must devise clever ways of separating this irrelevant background from interactions that hint at supersymmetry.
One framework guiding current searches for superpartners is split supersymmetry, which Arkani-Hamed and a colleague proposed back in 2004. Split supersymmetry gives lightweight superpartners to half of the Standard Model particles and heavy superpartners to the rest.
Arkani-Hamed views split supersymmetry as the most promising theory given current data. Still, “theorists are not married to things. We’re trying to discover the truth,” he says. “So we pick up ideas and explore them to see what they imply, and we let experiment decide.”
Although split supersymmetry offers a dark matter candidate and unifies the fundamental forces at high energies, it does not address the stability of the Higgs boson, leaving some theorists doubtful. “My number-one priority is to solve the Higgs problem, and I don’t see that split supersymmetry solves that problem,” Peskin says.
Faced with the dearth of experimental evidence for supersymmetry, he is now exploring alternative explanations for the properties of the Higgs boson. Just as protons are made up of quarks and gluons, Peskin suspects that the Higgs boson may have a hidden substructure.
The saga of supersymmetry “should be taken as a caution,” Gates says. “This, unfortunately, is an example where the particle physics community got its head out over its skis. We should always be extraordinarily mindful that nature is indeed subtle, and we have to take our cues from nature.”
The search continues
As segments of the particle physics community have drifted away from supersymmetry, many experimentalists remain optimistic. “We’re doing things with our detector now that we never thought would have been possible when we built it,” Herwig says. “Being able to do these things really opens up entirely new possibilities and analysis strategies that we’re working to implement for the next years of data-taking.”
By the late 2020s, an upgrade to the LHC known as the High-Luminosity LHC will allow experimentalists to explore uncharted areas of the supersymmetric landscape.
Future colliders that will probe even higher energies could turn up superpartners, too. But instead of being motivated by the search for supersymmetry, “the reason for building the next colliders is to study the Higgs boson to death, full stop,” Arkani-Hamed says.
Peskin agrees that learning more about the Higgs is crucial to understanding physics beyond the Standard Model. “Almost every theory of the Higgs boson is consistent with current data,” and so none of them can be ruled out, he says. “We really don’t know anything about it.”
It could be decades before physicists know the truth about supersymmetry. If superpartners exist, Gates says that up to a century could pass before their discovery.
But “we know how to be patient as a community,” Herwig says.
For neutrinos, the path from theoretical prediction to experimental observation took 25 years. For the Higgs boson, it took a half-century. And for gravitational waves, it took a full 100 years.
Supersymmetry may not solve all the problems that researchers initially hoped it would. But Resseguie is excited to forge ahead in the quest to understand nature: “Either supersymmetry is the answer, or it is not, but the only way we’ll find out is if we keep looking.”
CMS collaboration releases its first open data from heavy-ion collisions
2021-01-12T14:28:16Z via NavierStokesApp To: Public
"CMS collaboration releases its first open data from heavy-ion collisions"
For a few weeks each year of operation, instead of colliding protons, the Large Hadron Collider (LHC) collides nuclei of heavy elements (“heavy ions”). These heavy-ion collisions allow researchers to recreate in the laboratory conditions that existed in the very early universe, such as the soup-like state of free quarks and gluons known as the quark–gluon plasma. Now, for the first time, the Compact Muon Solenoid (CMS) collaboration at CERN is making its heavy-ion data publicly available via the CERN Open Data portal.
Over 200 terabytes (TB) of data were released in December, from collisions that occurred in 2010 and 2011, when the LHC collided bunches of lead nuclei. Using these data, CMS had observed several signatures of the quark–gluon plasma, including the imbalance between the momenta of each jet of particles produced in a pair, the suppression (“quenching”) of particle jets in jet–photon pairs and the “melting” of certain composite particles. In addition to lead–lead collision data (two data sets from 2010 and four from 2011), CMS has also provided eight sets of reference data from proton–proton collisions recorded at the same energy.
The open data are available in the same high-quality format that was used by the CMS scientists to publish their research papers. The data are accompanied by the software that is needed to analyse them and by analysis examples. Previous releases of CMS open data have been used not only in education but also to perform novel research. CMS is hopeful that communities of professional researchers and amateur enthusiasts as well as educators and students at all levels will put the heavy-ion data to similar use.
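The portal hosting the release exposes dataset records over plain HTTP. As a rough illustration of how one might look up a record programmatically (the `/api/records/` path and the record ID below are assumptions for illustration, not details taken from this announcement), a minimal Python sketch:

```python
import json
from urllib.request import urlopen

BASE = "http://opendata.cern.ch"  # the CERN Open Data portal named in the article

def record_url(recid):
    """Build a REST URL for a portal record.

    The /api/records/ path is an assumption based on the portal's
    Invenio-style API; check the portal documentation before relying on it.
    """
    return f"{BASE}/api/records/{recid}"

def fetch_record(recid):
    """Fetch a record's JSON metadata (requires network access)."""
    with urlopen(record_url(recid)) as resp:
        return json.load(resp)

# Hypothetical record ID, purely for illustration:
print(record_url(14220))  # → http://opendata.cern.ch/api/records/14220
```

In practice the released datasets are large (over 200 TB in this release), so one would normally download only the metadata and selected files rather than whole datasets.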
“Our aim with releasing CMS data into the public domain via the Creative Commons CC0 waiver is to preserve our data and the knowledge needed to use them, in order to facilitate the widest possible use of our data,” says Kati Lassila-Perini, who has led the CMS open-data project since its inception in 2012. “We hope that those outside CMS will find these data as fascinating and valuable as we do.”
CMS has committed to releasing 100% of the data recorded each year after an embargo period of ten years, with up to 50% of the data being made available in the interim. The embargo allows the researchers who built and operate the CMS detector adequate time to analyse the data they collect. With this release, all of the research data recorded by CMS during LHC operation in 2010 and 2011 is now in the public domain, available for anyone to study.
You can read more about the release on the CERN Open Data portal: opendata.cern.ch/docs/cms-releases-heavy-ion-data
Martinus Veltman (1931-2021)
2021-01-11T08:27:35Z via NavierStokesApp To: Public
"Martinus Veltman (1931-2021)"
Martinus “Tini” Veltman, who shared the 1999 Nobel Prize in Physics with his former student Gerardus ‘t Hooft, passed away on 4 January at the age of 89. A regular visitor to CERN since the early 1960s, Veltman served on the Scientific Policy Committee from 1976 to 1982, where he was a staunch supporter of the CERN model for intergovernmental research, contributing much to the scientific direction of the Laboratory. It is through his contributions to the Standard Model, our current description of elementary particles and their interactions, that he had the most profound influence on the research of CERN. His 1970s work with ’t Hooft on the renormalisation of spontaneously broken Yang-Mills theories is a cornerstone of the Standard Model. Their work provided decisive support to the Weinberg-Salam theory as a realistic description of weak and electromagnetic interactions, with the inclusion of a new particle: a particle whose existence had been anticipated several years before, and that is now commonly known as the Higgs boson. With his departure, we have lost one of the founding fathers of modern particle physics. A full obituary will appear in the CERN Courier.
Week 52 at the Pole
2021-01-07T23:27:31Z via NavierStokesApp To: Public
The uncertain future of North America’s ash trees, and organizing robot swarms
2021-01-07T19:27:32Z via NavierStokesApp To: Public
"The uncertain future of North America’s ash trees, and organizing robot swarms"
Freelance journalist Gabriel Popkin and host Sarah Crespi discuss what will happen to ash trees in the United States as federal regulators announce dropping quarantine measures meant to control the emerald ash borer—a devastating pest that has killed tens of millions of trees since 2002. Instead of quarantines, the government will use tiny wasps known to kill the invasive beetles in hopes of saving the ash. Sarah also talks with Pavel Chvykov, a postdoctoral researcher at the Massachusetts Institute of Technology, about the principles for organizing active matter—things like ant bridges, bird flocks, or little swarms of robots. This week’s episode was produced with help from Podigy. Listen to previous podcasts. About the Science Podcast Download a transcript (PDF).
High school teachers, meet particle physics
2021-01-05T22:27:30Z via NavierStokesApp To: Public
"High school teachers, meet particle physics"
Workshops around the world train science teachers to incorporate particle physics into their classrooms.
Picture this: A stationary object such as a vase suddenly explodes, sending fragments flying. Given the final energies and momenta of the fragments, can you figure out the mass the object had before it broke apart?
Dave Fish gives his students this common conservation-of-momentum problem—with a twist. Instead of describing the explosion of a macroscopic object like a vase, he describes the transformation of a top quark and top antiquark into other fundamental particles.
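Fish’s twist maps directly onto relativistic kinematics: in natural units (c = 1), the mass of the original object is the invariant mass of its fragments, M = √((ΣE)² − |Σp|²). A minimal Python sketch of the calculation, using made-up fragment four-momenta rather than real top-quark data:

```python
import math

def invariant_mass(fragments):
    """Reconstruct the mass of a decayed object from its fragments.

    Each fragment is a four-momentum (E, px, py, pz) in GeV,
    in natural units where c = 1.
    """
    E = sum(f[0] for f in fragments)
    px = sum(f[1] for f in fragments)
    py = sum(f[2] for f in fragments)
    pz = sum(f[3] for f in fragments)
    return math.sqrt(E**2 - (px**2 + py**2 + pz**2))

# Toy two-body decay: the fragments fly apart back to back, so the
# total momentum is zero and the mass is simply the total energy.
frags = [(90.0, 30.0, 40.0, 0.0), (90.0, -30.0, -40.0, 0.0)]
print(invariant_mass(frags))  # → 180.0
```

The same function works whether the “explosion” is a vase or a top quark–antiquark pair; only the input four-momenta change.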
Fish teaches high school physics and is a teacher-in-residence at Perimeter Institute for Theoretical Physics in Ontario. In Canada, particle physics is “one of those things that teachers tend to leave until the end of the course, and then they run out of time,” Fish says. “Most of us, as high school teachers, feel overwhelmed by the content.”
Particle physics makes an appearance in the curriculum for the International Baccalaureate, a program recognized as a qualification for entry into higher education by many universities around the world. The subject also appears in some state curricula, like that of North Rhine-Westphalia in Germany. But in general, “there are not a lot of curricula that feature particle physics explicitly,” says Jeff Wiener, the Teacher Programmes Manager at CERN. “Those that do usually focus on the rather boring stuff like: ‘Name two leptons.’”
Putting particles in the curriculum
Many high school science teachers who would like to teach particle physics say they feel insufficiently knowledgeable about the subject or are unsure how to include it without sacrificing required curricular topics.
Fish and Wiener are two of the many people who hope to change that. They see many opportunities for integrating particle physics into standard curricula focused on general physics concepts. To teach conservation of momentum, try using real data from the discovery of the top quark (an activity developed by educators at the US Department of Energy’s Fermi National Accelerator Laboratory). To demonstrate the motion of charged particles in magnetic fields, show photographs from particle detectors called bubble chambers. To provide an example of circular motion, discuss the mystery of dark matter.
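The bubble-chamber idea also makes a good numerical exercise: a charged particle moving perpendicular to a magnetic field traces a circle of radius r = p/(qB), which is exactly the circular-motion formula students already know. A short Python sketch (the particle and field values below are illustrative, not taken from any real event):

```python
# Radius of curvature of a charged particle in a magnetic field,
# as seen in bubble-chamber photographs: r = p / (q * B).
E_CHARGE = 1.602176634e-19   # elementary charge, coulombs
C_LIGHT = 2.99792458e8       # speed of light, m/s

def curvature_radius(p_gev, charge_e, b_tesla):
    """Radius in metres for transverse momentum p_gev (GeV/c),
    charge in units of e, and magnetic field in tesla."""
    p_si = p_gev * 1e9 * E_CHARGE / C_LIGHT  # GeV/c -> kg*m/s
    return p_si / (abs(charge_e) * E_CHARGE * b_tesla)

# Example: a 3 GeV/c muon (charge -1) in a 3.8 T field bends with
# a radius of roughly 2.6 m.
print(round(curvature_radius(3.0, -1, 3.8), 2))  # → 2.63
```

The calculation reduces to the handy rule of thumb r[m] ≈ p[GeV/c] / (0.3 · B[T]), which students can check against tracks in real detector images.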
One of Fish’s former students, Nikolina Ilic, considers a project on dark matter she undertook in his class to be a turning point in her education. “I realized that we don’t know what 95% of the universe is made of, and this was mind-blowing to me,” she says. “That was when I decided to pursue particle physics.”
Ilic went on to do her PhD research at CERN, where she contributed to the statistical analysis for the discovery of the Higgs boson.
In the years when he does not teach high school students, Fish leads workshops at Perimeter Institute to help other teachers bring particle physics into their classrooms. Each year, about 40 or 50 teachers from Canada and other countries attend a weeklong EinsteinPlus workshop, participating in a variety of collaborative activities meant to teach them about modern physics. One of the most popular is a card-sorting game that teaches the patterns and symmetries of the Standard Model. In each activity, “we ask the teachers to be the students and ask the questions that students would ask,” Fish says.
Fermilab runs similar teacher workshops covering various physics topics for elementary through high school teachers.
As the COVID-19 pandemic has forced many programs to move online, Fermilab has leaned into finding ways to interact with teachers and students virtually. “We have career talks with lab staff, classroom presentations we are creating with teachers and running virtually, Virtual Ask-a-Scientist, and Saturday Morning Physics,” says Amanda Early, Fermilab’s education program leader who manages the K-12 physical science programs.
Each year, Fermilab runs programs for educators and students, engaging them with Fermilab science. "The more you expose students to particle physics—the sheer size and scale of it and its benefits—the more opportunities kids will see to engage in science," Early says.
This year, one of the education group’s Summer Secondary Science Institutes specifically focused on helping high school teachers tailor modern physics lessons to the Next Generation Science Standards used in many US states. Around 80 teachers from the Chicago area and across the country participated in the interactive five-day workshop, which was held online this year.
The Next Generation Science Standards do not mention particle physics explicitly. But the crosscutting concepts and science and engineering practices that frame them mesh well with the subject, says David Torpe, an Illinois high school science teacher who has taught professional development workshops at Fermilab for six years.
“Let’s talk about process, let’s talk about how particle physicists analyze data, let’s talk about how they problem-solve,” Torpe says. “The ideas of energy and cause and effect fit in naturally, too. I think a good strategy is to find a little bit of particle physics that you find interesting and pop it in here or there.”
Bringing teachers to CERN, and CERN to teachers
Across the pond in Europe, CERN’s Teacher Programmes attract over 1000 high school teachers from around the world to Geneva annually. Between physics lectures, the teachers tour laboratories and have Q&A sessions with CERN scientists.
“The idea was that upon returning to Mexico, we would be ambassadors and encourage some students to see that it’s possible to go do research at CERN,” says Eduardo Morales Gamboa, who attended the Spanish-Language Teacher Programme in 2019.
Since visiting the enormous CMS detector and seeing particle tracks in a homemade cloud chamber, he has incorporated particle physics—and the many useful applications that have stemmed from it—into his classroom discussions of the intersections of science, technology and society. Eventually, he says, he hopes to build a cloud chamber with his students.
According to Wiener, Morales Gamboa’s experience is a common one. Many alumni of the Teacher Programmes even come back to CERN, this time with their students along for the trip, to spark the next generation’s enthusiasm for particle physics.
The success of CERN’s outreach efforts stems in part from integration with physics education research. Indeed, CERN’s Teacher Programmes are designed to give attendees knowledge not just of particle physics, but also of pedagogical best practices for teaching science.
One such practice is taking students through “predict-observe-explain” cycles. “You encourage students to make a prediction of what is going to happen before they do the experiment. This way, you make sure that they first activate their previous knowledge and get curious about the outcome,” says Julia Woithe, who coordinates hands-on learning labs at CERN. “Then, if they are surprised by the observed outcome, they need to figure out in a team how to explain the differences between their predictions and observations. This usually leads to a powerful ‘eureka!’ moment.”
In addition to running events at CERN, Wiener traveled to India to collaborate with educators from the International School of Geneva in the first South Asia Science Education Programme last year. Eighty teachers from the region attended the weeklong program at Shiv Nadar School Noida in New Delhi.
Vinita Sharat, the school’s STEAM coordinator, has taught particle physics for a decade but remembers facing resistance at first from the organizations where she previously worked. “The first challenge is to change the mindset of authority,” she says. “They asked why I was teaching it since it’s not in the scope of the syllabus.”
Her students, on the other hand, had no qualms. Some found particle physics so fascinating that they would stay online until midnight discussing quarks and leptons with Sharat. “Students will always be ready to learn something related to nature,” she says.
Sharat fosters students’ creative sides in her lessons about particle physics by encouraging them to write poems, make videos or choreograph dances to explain the concepts they are studying. Like Fish, Sharat has stayed in contact with several former students whom she inspired to pursue a career in physics.
“The base of everything”
After CERN’s program at her school, Sharat hopes that more teachers across South Asia will bring particle physics into their classrooms. And Wiener plans to lead more teaching workshops across the world in the future.
For now, COVID-19 has brought in-person professional development workshops to a halt. But teachers can still access some resources online: CERN’s hands-on learning lab S’Cool LAB (run until recently by Woithe), Perimeter Institute, Fermilab and QuarkNet offer free downloads of their interactive teaching materials.
For Morales Gamboa, the benefits of teaching particle physics in high school extend beyond inspiring a few students to pursue careers in the field. Talking about the connections to engineering shows how abstract scientific ideas relate to everyday life, while describing the massive international projects conveys the spirit of collaboration key to modern science.
Stacy Gates, an Illinois high school science teacher who taught Fermilab’s Summer Secondary Physics Institute alongside Torpe this year, emphasizes that teaching particle physics fosters critical thinking. “I encourage my students to question me when they don’t believe that particles can behave a certain way,” she says. “That’s such an important skill because that’s what scientists do. They question everything, and they try to prove and disprove.”
Sharat agrees that particle physics holds valuable lessons. No matter where her students go in life, she wants them to understand that “particle physics is the base of everything,” she says.
“We should know the reason of our existence. We should know what we are made of.”
Areas to watch in 2021, and the living microbes in wildfire smoke
2020-12-31T19:27:21Z via NavierStokesApp To: Public
"Areas to watch in 2021, and the living microbes in wildfire smoke"
We kick off our first episode of 2021 by looking at future trends in policy and research with host Meagan Cantwell and several Science news writers. Ann Gibbons talks about upcoming studies that elucidate social ties among ancient humans, Jeffrey Mervis discusses relations between the United States and China, and Paul Voosen gives a rundown of two Mars rover landings. In research news, Meagan Cantwell talks with Leda Kobziar, an associate professor of wildland fire science at the University of Idaho, Moscow, about the living component of wildfire smoke—microbes. The bacteria and fungi that hitch a ride on smoke can impact both human health and ecosystems—but Kobziar says much more research is needed. This week’s episode was produced with help from Podigy. Listen to previous podcasts. About the Science Podcast Download the transcript (PDF).