The Global Community of Particle Physics
This account brings you hot items from public particle physics news sources, including CERN, SymmetryMagazine.org, and Interactions.org.
Making science more equitable, starting with 101
2020-07-07T17:27:49Z via NavierStokesApp To: Public
"Making science more equitable, starting with 101"
A new collaborative project aims to make introductory STEM courses successful for everyone.
Physicist Tim McKay has taught enough introductory physics courses to know what many university students think about them: They are difficult. You will get a lower grade in them than in your other courses. And worst of all: If you don’t do well, then you probably weren’t meant to study science after all.
Studies have shown that those who face the worst consequences from this mentality are those who are already less likely to be found in many STEM fields: women, underrepresented minorities and students from low-income backgrounds.
In higher education, this is no secret. But creating the cultural shift needed to understand the core of these issues requires effort and collaboration across institutions, not to mention buy-in from the top down.
Luckily, McKay, who has been a professor at the University of Michigan since 1995, is up for the challenge. And he’s equipped with the not-so-secret tool that every physicist turns to for answers: data.
After cutting his teeth early in his career on the Sloan Digital Sky Survey—a collaboration of hundreds of scientists who studied the universe using telescope data—he understood the value of an experiment that brought together top interdisciplinary minds to find new insights in huge datasets.
For the past decade, he has used those tools to study inequities in education. Now, as head of the Sloan Equity and Inclusion in STEM Introductory Courses (SEISMIC) project, he has brought together 180 people from 10 public research universities to understand what makes STEM courses inequitable among students.
Not only that, the group is working to create concrete solutions that shift the way these courses are taught and to create a new way for professors and university leadership to think about education. In a year where education systems across the country are being disrupted and examined, members say that perhaps now is the time to create real change.
“We have to change the culture,” says McKay, who is now an associate dean for undergraduate education at the University of Michigan. “I’d really like for students to take intro to science courses and come out feeling like they had real success, like they were set up to learn the deep roots of the field, rather than feeling like they got through by the skin of their teeth and didn’t understand anything.”
Shifting away from STEM ‘grade penalties’
For many students at large research universities, intro-to-STEM courses involve large lecture halls filled with hundreds of students, all watching a single professor write notes and explain the basics of the field at the front of the room—a scenario that has played out the same way for decades.
For some students, this works fine. As an undergraduate at Temple University in the late 1980s, McKay himself found his way into physics through courses just like these. What inspired him was his enthusiastic, magnetic instructor, Jack Crow, who got him excited about the consistency and logic of physics.
When McKay first became a professor, he tried to emulate that enthusiasm, thinking it was enough to inspire his students. “But I found that teaching is a craft that you learn how to do, and you learn through evidence, not instinct,” he says. “Enthusiasm is not a terrible approach, but it’s far from the most effective.”
He began experimenting in his courses, giving students electronic clickers to answer questions during class. He found that it engaged students who would otherwise passively sit in the back and listen. When he became director of the honors program within Michigan’s College of Literature, Science and the Arts in 2008, he realized he had no idea what sort of pedagogy was working well across disciplines. To find out, he turned to data.
He knew that students tended to get lower grades in intro-to-STEM courses, but he wanted to examine how those grades fared against the average grade they received in all of their other classes. In other words, what were the grade penalties associated with those courses? And were those penalties the same for different groups of students?
Looking first at penalties for male versus female students, he found that in intro-to-physics courses, the grade-point penalty for male students was 0.3, while the penalty for female students was 0.6.
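The grade-penalty metric described above can be sketched in a few lines. This is a toy illustration with hypothetical cohorts, not Michigan's actual dataset; the names and numbers here are invented to echo the pattern in the text.

```python
# "Grade penalty": a student's grade in the intro course minus their average
# grade in all other courses, averaged over a group. Negative means a penalty.
def grade_penalty(records):
    """records: list of (course_grade, other_gpa) tuples for one group."""
    gaps = [course - gpa for course, gpa in records]
    return sum(gaps) / len(gaps)

# Hypothetical cohorts echoing the pattern described in the text.
male = [(3.0, 3.3), (2.7, 3.0), (3.3, 3.6)]
female = [(2.8, 3.4), (2.5, 3.1), (3.0, 3.6)]

print(f"male penalty:   {-grade_penalty(male):.1f}")    # 0.3
print(f"female penalty: {-grade_penalty(female):.1f}")  # 0.6
```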
Why was this? Female students tended to outperform male students across disciplines at Michigan, except in the large introductory science and math courses. McKay wanted to know: Was this just happening at Michigan? He began recruiting other universities to ask the same question, and they found that the patterns were the same everywhere.
“These inequities were systemic,” he says. “It’s shameful to discover that you’ve been teaching a course that has this outcome. Once I knew it was everywhere, I thought: I have to do something about this.”
Understanding the problem of inequity
With funding from the Sloan Foundation, McKay created SEISMIC. Since 2019, the group has divided into working groups focused on four topics: measurements, structures, experiments and constructs.
Because of McKay’s findings at Michigan, one of their first targets was the gender gap in intro-to-STEM courses, says Sehoya Cotner, a biology education professor at the University of Minnesota and one of the leaders of the experiments working group.
In large intro-to-STEM lecture courses, high-stakes, timed exams are often the evaluation tool professors use to gauge whether students understand the concepts. Cotner’s research—and research at other universities—has shown that female students often perform worse on these types of tests compared to male students.
Scientists have investigated several factors that could contribute to this gap, such as stereotype threat: performance-diminishing anxiety related to a fear of confirming a negative stereotype about a group of which you are a member. Female students are more likely to report experiencing test anxiety than male students, but the cause of this gap likely involves myriad factors and will need more research, Cotner says.
Still, the group is working to find alternatives to timed tests for evaluating students in introductory STEM courses. These evaluations could involve tasks more akin to authentic physics research, such as writing out an analysis or working on a group project.
They’re also looking at other interventions to deal with student anxiety, such as early discussions about how challenges in these courses are normal, temporary and surmountable.
Though SEISMIC began by looking at gender, participants are also looking at how these interventions could close gaps between other groups. At the University of Minnesota, Cotner found that “belongingness” discussions lowered the performance gap in an Introduction to Chemistry course between students from groups that are underrepresented at the institution (students who identified as Hispanic/Latinx, Black, Native American or Pacific Islander) and students who identified as white or Asian.
The SEISMIC group has begun to expand their focus—looking at how gender, race, income and first-generation status all affect students. They want to take a nuanced approach, and early results show that students at the intersections of these different minority groups stand to lose the most from grade penalties.
“We are working beyond gender and looking at what works for certain subjects and in certain situations,” Cotner says. “We want to contextualize it and give our colleagues in STEM information that they can hang their hats on. It’s not fuzzy or theoretical—we want to create concrete, actionable information that will lower barriers and reduce inequities.”
Shifting the course from ‘prior opportunity’
For the past 25 years—first as a chemistry instructor and now as assistant vice provost for educational effectiveness at University of California, Davis—SEISMIC member Marco Molinaro has been gathering data about inequities among students from different backgrounds and neighborhoods.
“Especially in introductory courses,” he says. “They are the Achilles’ heels of STEM education. When you have students that are first-generation, low-income and underrepresented minorities, they can be lost at three times the rate of white students who have benefitted from greater prior opportunities to learn the material. Even if they stay in the class, they are often a full course grade behind other students.”
As one of the leaders of SEISMIC’s structures working group, he is helping to find the right data and approach to communicate and effect change at the structural level. The group is looking at datasets including regional opportunity indexes, census information and estimated income potential. They want to see how students’ backgrounds—and the opportunities or lack of opportunities those backgrounds afforded them—relate to how they perform at the university level. In other words, how their “prior opportunity” level affects future opportunities.
“We are looking at how to change the discussion,” he says. “What is really happening is that their prior opportunity seems to be continuing once they get to college. Now we are looking to find instances of introductory STEM courses where students outperform their prior opportunity level and see how we can replicate that success across institutions.”
Part of SEISMIC’s goal is to not only find the right data, but communicate it in a way that will get even the most stalwart professors and leaders to change their thinking.
For example, when McKay talks to other STEM professors about gender inequities in an intro-to-STEM course, many of them dismiss the idea that the data reveal a problem with the course itself. “Their first instinct is to explain it away.”
Instead, he often first presents the data as coming from students who took the same course but were taught in different classes by different instructors. Professors are always interested in correcting inequities between these classes, McKay says.
Molinaro’s group is now working to create a how-to manual for institutions to examine their own behaviors and structural inequities in STEM courses. “If we can start to answer the question ‘What do we do now?’, that’s where I think we’ll have success,” he says.
Seizing the moment to shift the culture
While SEISMIC was launched in 2019, members say they are seizing this moment—in which educators, students and parents are reckoning with how to work with students during a pandemic—to make their case.
“The pandemic has forced everyone to reconsider and perhaps change the basic structures of their courses,” McKay says. “We know that the pandemic has hit Black [and other ethnic minority] communities much harder than others, and we fear that all of the disruption anticipated for this year will make things worse. It’s an opportunity for us to move forward with this colossal shift that is needed. We can find new modes of evaluations that instructors like and that will help solve equity issues.”
For Cotner, who lives in the Minneapolis-St. Paul area, where the police killing of George Floyd and resulting unrest has brought the issue of systemic racism and social inequities to the forefront, a project like SEISMIC is one way she feels like she can effect real change.
“I can give money, I can go to rallies, but I can also use my training and funding to try to do this one tiny part, which is make science education more equitable,” she says. “We want more people of color to become principal investigators and leaders in science. It’s what we can do in our little part of the world.”
An oasis of biodiversity in a Mexican desert, and making sound from heat
2020-07-02T18:28:25Z via NavierStokesApp To: Public
"An oasis of biodiversity a Mexican desert, and making sound from heat"
First up this week, News Intern Rodrigo Pérez-Ortega talks with host Meagan Cantwell about an oasis of biodiversity in the striking blue pools of Cuatro Ciénegas, a basin in northern Mexico. Researchers have published dozens of papers exploring the unique microorganisms that thrive in this area, while at the same time fighting large agricultural industries draining the precious water from the pools. David Tatnell, a postgraduate researcher at the University of Exeter, talks with host Sarah Crespi about using heat to make sound, a phenomenon known as thermoacoustics. Just like the sound of fire or thunder, sudden changes in temperature can create sound waves. In his team’s paper in Science Advances, Tatnell and colleagues describe a thermoacoustic speaker that uses thin, heated films to make sound. This approach cuts out the crosstalk seen in mechanical speakers and allows for extreme miniaturization of sound production. In the ultrasound range, arrays of thermoacoustic speakers could improve acoustic levitation and ultrasound imaging. In the hearing range, the speakers could be made extremely small, flexible, and even transparent.
Week 25 at the Pole
2020-07-02T17:28:26Z via NavierStokesApp To: Public
Sun’s shadow on IceCube shines light on solar magnetic field
2020-07-01T14:28:48Z via NavierStokesApp To: Public
"Sun’s shadow on IceCube shines light on solar magnetic field"
The IceCube Collaboration recently performed an analysis to try to expand our understanding of the solar magnetic field by studying the time-dependent cosmic-ray Sun shadow. They also wanted to explore how the cosmic-ray Sun shadow changes at different energy regimes. The results, recently submitted to Physical Review D, show that more solar activity leads to a weaker Sun shadow. There were also indications that, in times of high solar activity, the shadow becomes stronger at higher energies—a hint at Sun-shadow energy dependence that will be explored more in future studies.
LHCb discovers a new type of tetraquark at CERN
2020-07-01T09:28:52Z via NavierStokesApp To: Public
"LHCb discovers a new type of tetraquark at CERN"
The LHCb collaboration has observed a type of four-quark particle never seen before. The discovery, presented at a recent seminar at CERN and described in a paper posted today on the arXiv preprint server, is likely to be the first of a previously undiscovered class of particles.
The finding will help physicists better understand the complex ways in which quarks bind themselves together into composite particles such as the ubiquitous protons and neutrons that are found inside atomic nuclei.
Quarks typically combine together in groups of twos and threes to form particles called hadrons. For decades, however, theorists have predicted the existence of four-quark and five-quark hadrons, which are sometimes described as tetraquarks and pentaquarks, and in recent years experiments including the LHCb have confirmed the existence of several of these exotic hadrons. These particles made of unusual combinations of quarks are an ideal “laboratory” for studying one of the four known fundamental forces of nature, the strong interaction that binds protons, neutrons and the atomic nuclei that make up matter. Detailed knowledge of the strong interaction is also essential for determining whether new, unexpected processes are a sign of new physics or just standard physics.
“Particles made up of four quarks are already exotic, and the one we have just discovered is the first to be made up of four heavy quarks of the same type, specifically two charm quarks and two charm antiquarks,” says the outgoing spokesperson of the LHCb collaboration, Giovanni Passaleva. “Up until now, the LHCb and other experiments had only observed tetraquarks with two heavy quarks at most and none with more than two quarks of the same type.”
“These exotic heavy particles provide extreme and yet theoretically fairly simple cases with which to test models that can then be used to explain the nature of ordinary matter particles, like protons or neutrons. It is therefore very exciting to see them appear in collisions at the LHC for the first time,” explains the incoming LHCb spokesperson, Chris Parkes.
The LHCb team found the new tetraquark using the particle-hunting technique of looking for an excess of collision events, known as a “bump”, over a smooth background of events. Sifting through the full LHCb datasets from the first and second runs of the Large Hadron Collider, which took place from 2009 to 2013 and from 2015 to 2018 respectively, the researchers detected a bump in the mass distribution of a pair of J/ψ particles, which consist of a charm quark and a charm antiquark. The bump has a statistical significance of more than five standard deviations, the usual threshold for claiming the discovery of a new particle, and it corresponds to a mass at which particles composed of four charm quarks are predicted to exist.
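The bump-hunting logic described above can be illustrated with a deliberately simplified significance estimate. This is a toy calculation with made-up event counts, not the LHCb analysis, which uses a full likelihood fit to the mass distribution.

```python
import math

# Toy bump-hunt significance: with b background events expected in a mass
# window and n events observed, a common rough estimate of the excess
# significance is (n - b) / sqrt(b), i.e. the excess in units of the
# expected Poisson fluctuation of the background.
def naive_significance(n_obs, b_expected):
    return (n_obs - b_expected) / math.sqrt(b_expected)

# Hypothetical counts: 160 events seen where 100 background events are expected.
z = naive_significance(160, 100)
print(f"significance: {z:.1f} sigma")  # 6.0 sigma, above the 5-sigma threshold
```

A real analysis also accounts for systematic uncertainties and the “look-elsewhere effect” of scanning many mass windows, both of which reduce the effective significance.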
As with previous tetraquark discoveries, it is not completely clear whether the new particle is a “true tetraquark”, that is, a system of four quarks tightly bound together, or a pair of two-quark particles weakly bound in a molecule-like structure. Either way, the new tetraquark will help theorists test models of quantum chromodynamics, the theory of the strong interaction.
Read more on the LHCb website.
Hundreds of hadrons
2020-06-30T15:28:42Z via NavierStokesApp To: Public
"Hundreds of hadrons"
Hadrons count among their number the familiar protons and neutrons that make up our atoms, but they are much more than that.
In the early 1900s, physicists were trying to find the source of a low-level buzz of radiation that seemed to be present at all times, everywhere around the world. The reigning theory was that all of it came from the Earth itself.
That quickly changed, however, when physicist Albert Gockel used a hot air balloon to take measurements of the radiation far above sea level. To his surprise, it actually increased with altitude. This strange, pervasive energy, it seemed, was coming from above even more than from below. The source? High-energy particles from space called cosmic rays.
To detect these particles, physicists began taking to the skies in balloons and refitted war planes; hiking up precarious mountainsides; and side-stepping gaping crevasses in high-elevation glaciers. They brought with them cloud chambers, transparent containers full of dense water or alcohol vapor.
When a cosmic ray collides with an atom in a cloud chamber, a shower of smaller subatomic particles careens off in all directions. These charged particles ionize vapor molecules, which become visible as thin wisps of condensation. (See our article on how to make your own cloud chamber.)
Using cloud chambers—and later detectors filled with liquid hydrogen—physicists began to realize that there are far more particles than they had initially suspected.
They discovered muons, pions, kaons and lambda particles. Then, starting in the 1930s when the first particle accelerators began operation, physicists found themselves inundated with even more new subatomic particles.
“There were so many of these particles,” says Brian Shuve, a physicist at Harvey Mudd College, “that it seemed very unlikely that they were all elementary.”
That hunch turned out to be correct. Most of the new particles would eventually be classified as “hadrons,” composite particles made up of even tinier constituents called quarks.
Hadrons include such all-star members as the protons and neutrons that make up the nuclei of atoms, but the group is much larger than that. Through decades of meticulous study, we now know that there are more than 100 different hadrons. By studying them, physicists have been able to paint a clearer picture of the four fundamental forces that explain our universe.
The periodic table of hadrons
It’s hard out there for a massive particle. Generally, the more massive a particle is, the faster it decays, releasing its energy as it falls apart into less massive particles.
Knowing this, physicists were baffled by the behavior of some of the new particles they saw. Some of them, such as the kaons, were sticking around much longer than expected, based on their masses.
In 1952, physicists Murray Gell-Mann, Abraham Pais and Kazuhiko Nishijima introduced the concept of ‘strangeness’ to describe this property.
In 1961, Gell-Mann and physicist Yuval Ne’eman discovered that by charting a particle’s strangeness along one axis in a graph and its isospin—a quantum number related to the particle’s interaction with the strong force—along the other, they could group them into precise geometric figures.
Gell-Mann called this organization scheme the “Eightfold Way,” a term he borrowed from the Buddhist path to awakening. It allowed him to create a kind of periodic table for the hadrons.
Just as chemist Dmitri Mendeleev’s periodic table of elements initially contained gaps that allowed him to predict the existence of undiscovered elements, the Eightfold Way also contained gaps that led physicists to the discovery of new particles.
The case for quarks
In developing the Eightfold Way, Gell-Mann had crafted a puzzle with an intricate design. He knew roughly where the missing pieces were, but he couldn’t immediately make sense of the pattern.
Gell-Mann and the physicist George Zweig independently realized that the way in which hadrons were related to one another could be explained if they were actually made up of even smaller particles. Gell-Mann whimsically termed these theoretical elementary particles “quarks,” while Zweig called them “aces.”
But there were two problems with this theory.
First, in their experiments physicists had never detected anything even remotely resembling a quark. Second, and just as dire: Up until that point, the charges of all of the known particles came in whole numbers (i.e. 1, -1, 0). To make the theory of quarks or aces work, they had to have charges that were fractional.
This didn’t sit well with the established physics community, nor even with Gell-Mann himself. He ended his publication outlining the predicted properties of quarks by asking experimentalists to prove him wrong, saying, “A search for stable quarks… would help to reassure us of the non-existence of real quarks.”
Experimentalists obliged this request for several years without luck. But physicists persisted, and a series of experiments performed in the early 1970s at the US Department of Energy’s SLAC National Accelerator Laboratory—then called Stanford Linear Accelerator Center—finally drummed up evidence for the existence of these elementary particles.
Gell-Mann and Zweig were vindicated (though Gell-Mann’s name, “quarks,” won the day), and physicists had a new model for understanding the subatomic realm.
Researchers initially worked under the assumption that there were three quarks, although we now know that there are at least six, called up, down, top, bottom, charm and strange.
Most hadrons are made up of either two or three quarks.
Hadrons made up of three quarks—such as the proton and the neutron—are called baryons. (Protons contain two up quarks and a down quark, while neutrons have two down quarks and an up quark.)
Hadrons made up of two quarks are called mesons. These are a bit more exotic; one of their two quarks is always an antimatter particle. Pions, for example, can be positive, negative or neutral. Positive pions contain an up quark and an anti-down quark that are briefly pulled together in a delicate dance before decaying into a more stable form of matter.
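The fractional quark charges mentioned earlier add up to the familiar whole-number charges of baryons and mesons. A minimal sketch, using exact fractions and only the up and down quarks:

```python
from fractions import Fraction

# Quark electric charges in units of the proton charge; antiquarks flip sign.
CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3)}

def hadron_charge(quarks):
    """quarks: e.g. ["u", "u", "d"] for a proton; prefix "anti-" flips the sign."""
    total = Fraction(0)
    for q in quarks:
        if q.startswith("anti-"):
            total -= CHARGE[q[len("anti-"):]]
        else:
            total += CHARGE[q]
    return total

print(hadron_charge(["u", "u", "d"]))   # 1  (proton: 2/3 + 2/3 - 1/3)
print(hadron_charge(["d", "d", "u"]))   # 0  (neutron: -1/3 - 1/3 + 2/3)
print(hadron_charge(["u", "anti-d"]))   # 1  (positive pion: 2/3 + 1/3)
```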
All in all, physicists have either directly detected or otherwise inferred the existence of more than 100 different hadrons, including a few varieties of four- and even five-quark particles.
An unstable partnership
If there are so many different types of hadrons in the universe, why are protons and neutrons the only two that seem to constitute visible matter? To answer this question, we have to return to the question of stability.
Each of the six types of quark has its own mass, ranging from the light up and down quarks, each of which has a mass equal to less than a percent of the proton’s, to the top quark, which is a whopping 175 times more massive than a proton. To put that into perspective, the difference between the mass of the up quark and the top quark is roughly the difference in weight between a tennis ball and an elephant.
Since protons are made up of extremely small quarks, you might be asking where the proton gets most of its mass. You’re not alone.
“The majority of a hadron’s mass actually comes from the energy of the gluons that bind quarks together,” says Cesar Luis Da Silva, a physicist at Los Alamos National Laboratory. “But exactly how the energy of gluons translates to the mass of hadrons is a question physicists are still trying to answer.”
Hadrons made up of heavier quarks tend to be unstable due to their excess energy and thus exist only briefly before decaying into smaller particles. But the rate at which hadrons decay is governed by which force they interact with.
“Neutral pions decay 300 million times faster than charged pions, even though they have the same mass,” says Da Silva. “That’s because neutral pions decay via the electromagnetic interactions, whereas charged pions decay through the weak force.”
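The “300 million times faster” figure can be checked against the measured mean lifetimes. The numerical values below are approximate standard (Particle Data Group) values, not taken from the article itself:

```python
# Back-of-the-envelope check on the quoted decay-rate ratio, using
# approximate measured mean lifetimes for the two pion species.
charged_pion_lifetime = 2.6e-8   # seconds (decays via the weak force)
neutral_pion_lifetime = 8.5e-17  # seconds (decays electromagnetically)

ratio = charged_pion_lifetime / neutral_pion_lifetime
print(f"neutral pions decay ~{ratio:.1e} times faster")  # ~3.1e+08, i.e. ~300 million
```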
The proton and neutron, made up of the lightest quarks, tend to stick around. But not even those particles are necessarily safe from the ravages of time, points out Dmitri Denisov, the deputy associate lab director for High-Energy Physics at Brookhaven National Laboratory.
“Neutrons in the nucleus of atoms can live for quite a long time—up to billions of years—but as soon as they’re free of the nucleus, they decay in about 15 minutes,” he says.
No one has ever observed a proton decay, but that may only be because they remain relatively stable for such a long time. It’s possible that in the far, distant future, all protons will have decayed into other forms of matter and energy.
As newer particle accelerators harness higher energies than their predecessors, physicists are able to create increasingly exotic particles. This has been a staple for researchers on the LHCb experiment at the Large Hadron Collider, Denisov says.
“The LHC—because it has such high energy—can create particles which contain more than two or three quarks,” he says. “Some can have four, called tetraquarks, or five—the pentaquarks.”
“Like most other hadrons, these particles are unstable and exist for mere billionths of a second,” Denisov says. “There are only a handful of them detected, and some of their properties are puzzling.”
Hadrons are still taking us to the edge of known physics and beyond. Just as the disorienting discovery of new hadrons in cloud chambers led to the theory of quarks, the new tetra- and pentaquarks may lead us to an even deeper understanding of how the universe works.
IceCube at Neutrino 2020
2020-06-29T13:28:34Z via NavierStokesApp To: Public
"IceCube at Neutrino 2020"
The IceCube Collaboration has a robust cohort presenting at the 29th biennial Neutrino conference, the world’s biggest conference in neutrino physics. Three IceCube collaborators will give plenary talks, and there are 30 virtual posters by collaborators.
White Rabbit, a CERN-born technology sets new global standard empowering world innovators
2020-06-29T10:29:02Z via NavierStokesApp To: Public
"White Rabbit, a CERN-born technology sets new global standard empowering world innovators"
White Rabbit (WR) is a technology developed at CERN to provide sub-nanosecond accuracy and picosecond precision of synchronisation for the LHC accelerator chain. First used in 2012, the technology has since then expanded its applications outside the field of particle physics and is now deployed in numerous scientific infrastructures worldwide. It has shown its innovative potential by being commercialised and introduced into different industries, including telecommunications, financial markets, smart grids, space industry and quantum computing.
CERN developed White Rabbit (WR) as open-source hardware, with primary adoption by other research infrastructures facing similar challenges in highly accurate synchronization of distributed electronic devices. The R&D process and all knowledge gained throughout the development has been made available through CERN's Open Hardware Repository. This gives other organisations and companies the freedom to use and modify existing information. Through proactive engagement of CERN's Knowledge Transfer and Beam Controls groups, a larger group of companies and organisations became involved in the development of hardware, software, and gateware for WR switches and nodes. The WR ecosystem quickly grew to include several organisations, developing open hardware for widespread benefit. This collaborative approach brought improvements to the original concept, allowing CERN to also benefit from the new developments.
On 16 June, the WR technology was recognised as a worldwide industry standard, incorporated into the Precision Time Protocol (PTP) governed by the IEEE, the world's largest technical professional organisation dedicated to advancing technology for the benefit of humanity. The WR addition to the PTP standard, referred to as High Accuracy, improves PTP's synchronisation performance by several orders of magnitude, from sub-microsecond to sub-nanosecond accuracy.
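The clock synchronisation that PTP standardises, and that White Rabbit's High Accuracy profile refines, rests on a simple four-timestamp exchange. A minimal sketch of that core calculation, with hypothetical timestamps (WR's sub-nanosecond accuracy comes from additional hardware timestamping and phase measurement not shown here):

```python
# IEEE 1588 (PTP) offset/delay calculation. t1: master sends Sync;
# t2: slave receives it; t3: slave sends Delay_Req; t4: master receives it.
# Assumes the link delay is symmetric in the two directions.
def ptp_offset_and_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2    # one-way link delay
    return offset, delay

# Hypothetical timestamps in nanoseconds: the slave clock runs 50 ns ahead
# of the master and the one-way link delay is 200 ns.
offset, delay = ptp_offset_and_delay(t1=1000, t2=1250, t3=1300, t4=1450)
print(offset, delay)  # 50.0 200.0
```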
“PTP is the first IEEE standard to incorporate a CERN-born technology. This is a major step for White Rabbit. It is already widely used in large scientific facilities and its adoption in industry is gaining momentum. Its incorporation into the PTP standard will allow hardware vendors world-wide to produce WR equipment compliant with the PTP standard and consequently accelerate its dissemination on a larger scale," says Maciej Lipinski, Electronics Engineer at CERN, who led the WR standardisation effort.
Week 24 at the Pole
2020-06-26T15:28:54Z via NavierStokesApp To: Public
SuperKEKB collider achieves the world's highest luminosity
2020-06-26T14:28:49Z via NavierStokesApp To: Public
"SuperKEKB collider achieves the world's highest luminosity"
Japan’s High Energy Accelerator Research Organization (KEK) has been steadily improving the performance of its flagship electron-positron collider, SuperKEKB, since it produced its first electron-positron collisions in April 2018. At 20:34 on 15th June 2020, SuperKEKB achieved the world’s highest instantaneous luminosity for a colliding-beam accelerator, setting a record of 2.22×10³⁴ cm⁻²s⁻¹. Previously, the KEKB collider, which was SuperKEKB’s predecessor and was operated by KEK from 1999 to 2010, had achieved the world’s highest luminosity, reaching 2.11×10³⁴ cm⁻²s⁻¹. KEKB’s record was surpassed in 2018, when the LHC proton-proton collider at the European Organization for Nuclear Research (CERN) overtook the KEKB luminosity at 2.14×10³⁴ cm⁻²s⁻¹. SuperKEKB’s recent achievement returns the title of world’s highest luminosity colliding-beam accelerator to KEK.(*)
(*) The current record is 2.40×10³⁴ cm⁻²s⁻¹, obtained at 00:53 JST on June 21st.
In the coming years, the luminosity of SuperKEKB will be increased to approximately 40 times the new record. This exceptionally high luminosity is to be achieved mainly by using a beam collision method called the “nano-beam scheme”, developed by Italian physicist Pantaleo Raimondi. Raimondi’s innovation enables significant increases in luminosity by using powerful magnets to squeeze the two beams in both the horizontal and vertical directions. Substantially decreasing the beam sizes increases the luminosity, which varies inversely with the cross-sectional area of the colliding beams.
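The inverse-area scaling described above can be made concrete. A minimal sketch follows, assuming head-on collisions of Gaussian bunches; the real SuperKEKB formula also carries crossing-angle and hourglass corrections, and all beam parameters below are illustrative, not official machine values.

```python
import math

# Instantaneous luminosity for head-on collisions of Gaussian bunches:
#   L = f_rev * n_b * N1 * N2 / (4 * pi * sigma_x * sigma_y)
# Squeezing sigma_x and sigma_y (the nano-beam scheme) raises L in inverse
# proportion to the beams' cross-sectional area at the collision point.

def luminosity(f_rev, n_bunches, n1, n2, sigma_x_m, sigma_y_m):
    """Luminosity in cm^-2 s^-1; revolution frequency in Hz, sizes in metres."""
    l_m2 = f_rev * n_bunches * n1 * n2 / (4 * math.pi * sigma_x_m * sigma_y_m)
    return l_m2 * 1e-4  # convert m^-2 s^-1 to cm^-2 s^-1

# Illustrative parameters with a ~220-nanometre vertical beam size, as in the text:
base = luminosity(1e5, 1000, 1e10, 1e10, 10e-6, 220e-9)
# Halving the vertical size (a harder squeeze) doubles the luminosity:
squeezed = luminosity(1e5, 1000, 1e10, 1e10, 10e-6, 110e-9)
print(squeezed / base)  # ~2.0, up to floating-point rounding
```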
SuperKEKB is the first collider in the world to realize the nano-beam scheme. In the beam operation of SuperKEKB, we keep increasing the luminosity by squeezing the beams ever harder, while solving various problems associated with the squeezing. Currently, the vertical height of the beams at the collision point is about 220 nanometers, and this will decrease to approximately 50 nanometers (about 1/1000 the width of a human hair) in the future.
Another factor that determines luminosity is the product of the two beam currents, which is proportional to the product of the numbers of electrons and positrons stored in the collider. KEK physicists and accelerator operators continue to increase the beam currents, while mitigating various high-current problems, such as stray background particles that introduce noise in the Belle II detector. SuperKEKB achieved the new luminosity record with a product of beam currents that was less than 25% that of KEKB. This demonstrates the superiority of the SuperKEKB design. In the future, we aim to increase the beam current product to about four times the value achieved by KEKB.
In order to adopt the nano-beam scheme and increase the beam current, KEKB underwent significant upgrades that turned it into SuperKEKB. These included a new beam pipe, new superconducting final-focusing magnets, a positron damping ring, and an advanced injector. The most recent improvement was completed in April 2020, with the introduction of the “crab waist”, first used at the DAΦNE accelerator in Frascati, Italy, in 2010, and which reduces the beam size and stabilizes collisions.
The success of SuperKEKB relies also on contributions from overseas. As an example, the superconducting final-focusing magnets were built in cooperation with Brookhaven National Laboratory and Fermi National Accelerator Laboratory in the U.S. under the U.S.-Japan Science and Technology Cooperation Program. Other major contributions under this program were the development of a collision-point orbit feedback system (SLAC National Accelerator Laboratory) and an X-ray beam size monitor (University of Hawaii and SLAC National Accelerator Laboratory). Researchers from CERN (Switzerland), IJCLab (France) and IHEP (China), as well as SLAC (U.S.), have participated in accelerator research and operation under KEK’s Multinational Partnership Project (MNPP-01). There are also contributions from many other foreign research institutes. Other important contributions have come through the Belle II experiment collaboration, such as the diamond-based radiation monitor and beam abort system (INFN and University of Trieste, Italy), and the luminosity monitoring system developed at BINP (Russia).
SuperKEKB brings its electron and positron beams into collision at the center of the Belle II particle detector. The detector has been built and is operated by the Belle II collaboration, an international group of approximately 1,000 physicists and engineers from 119 universities and laboratories located in 26 countries and regions around the world. Belle II physicists use the detector to explore fundamental physics phenomena, by studying the production and decay processes of particles produced in the collisions, primarily B mesons, D mesons, and tau leptons. To within the precision of current measurements, the behavior of particles such as these is well described by the theory known as the Standard Model. However, the Standard Model fails to address key questions, such as the mystery of the matter-dominated universe and the existence of dark matter. Therefore, new physical laws are needed to explain these observations. Signals of such “new physics” may arise in decay processes that are very rarely observed. Maximizing the discovery potential of Belle II for such signals requires a large number of electron-positron collisions, necessitating a very high-luminosity collider, such as SuperKEKB.
Collecting data for about 10 years, the Belle II experiment will accumulate 50 times more particle collisions than its predecessor, the Belle experiment. The large data set, containing about 50 billion B-meson pairs and similar numbers of charm mesons and tau leptons, will enable Belle II physicists to explore nature at a much deeper level than was previously possible. The data will also be used in sensitive searches for very weakly interacting particles that may help answer some of the outstanding mysteries of the universe.
Media Contact
High Energy Accelerator Research Organization (KEK)
KEK was established in 1997 in a reorganization of the Institute of Nuclear Study, University of Tokyo (established in 1955), the National Laboratory for High Energy Physics (established in 1971), and the Meson Science Laboratory of the University of Tokyo (established in 1988).
Scientists at KEK use accelerators and perform research in high-energy physics to answer the most basic questions about the universe as a whole, and the matter and the life it contains.
High Energy Accelerator Research Organization (KEK)
Public Relations Office, Head
Tel: +81 029-879-6047
Fax: +81 029-879-6049
http://www.kek.jp/en/
Still no sterile neutrinos, according to new IceCube analysis
2020-06-25T21:29:05Z via NavierStokesApp To: Public
"Still no sterile neutrinos, according to new IceCube analysis"
In two new papers, the IceCube Collaboration updates their eV-scale sterile neutrino search using an eight-year dataset and improved event selection. The analysis found no evidence of sterile neutrinos at this energy scale and was consistent with the no-sterile-neutrino hypothesis.
Stopping the spread of COVID-19, and arctic adaptations in sled dogs
2020-06-25T18:29:02Z via NavierStokesApp To: Public
"Stopping the spread of COVID-19, and arctic adaptations in sled dogs"
Kimberly Prather, an atmospheric chemist at the University of California, San Diego, who studies how ocean waves disperse virus-laden aerosols, joins host Sarah Crespi to talk about how she became an outspoken advocate for using masks to prevent coronavirus transmission. A related insight she wrote for Science has been downloaded more than 1 million times. Read Science’s coronavirus coverage.
Mikkel Sinding, a postdoctoral fellow at Trinity College Dublin, talks sled dog genes with Sarah. After comparing the genomes of modern dogs, Greenland sled dogs, and an ancient dog jaw bone found on a remote Siberian island where dogs may have pulled sleds some 9500 years ago, he and his colleagues found that modern Greenland dogs, which are still used to pull sleds today, have much in common with this ancient Siberian ancestor. Those similarities include genes related to eating high-fat diets and cold-sensing genes previously identified in woolly mammoths.
In this month’s book segment, Kiki Sanford talks with Rutger Bregman about his book, Humankind: A Hopeful History, which outlines a shift in the thinking of many social scientists to a view of humans as more peaceful than warlike.
This week’s episode was produced with help from Podigy. Listen to previous podcasts. About the Science Podcast. Download a transcript (PDF).
DUNE moves to the next stage with a blast
2020-06-24T21:29:00Z via NavierStokesApp To: Public
"DUNE moves to the next stage with a blast"
Construction workers have carried out the first underground blasting for the Long-Baseline Neutrino Facility, which will provide the space, infrastructure and particle beam for the international Deep Underground Neutrino Experiment.
It started with a blast.
On June 23, construction company Kiewit Alberici Joint Venture set off explosives 3,650 feet beneath the surface in Lead, South Dakota, to begin creating space for the international Deep Underground Neutrino Experiment, hosted by the Department of Energy’s Fermilab.
The blast is the start of underground excavation activity for the experiment, known as DUNE, and the infrastructure that powers and houses it, called the Long-Baseline Neutrino Facility, or LBNF.
Situated a mile deep in South Dakota rock at the Sanford Underground Research Facility, DUNE’s giant particle detector will track the behavior of fleeting particles called neutrinos. Over the next three years, workers will blast and drill to remove 800,000 tons of rock to make a home for the gigantic detector and its support systems.
“The start of underground blasting for these early excavation activities marks not only the initiation of the next major phase of this work, but significant progress on the construction already under way to prepare the site for the experiment,” says Fermilab Deputy Director for LBNF/DUNE-US Chris Mossey.
The excavation work begins with removing 3,000 tons of rock 3,650 feet below ground. This initial step carves out a station for a massive drill whose bore is as wide as a car is long, about four meters.
The machine will help create a 1,200-foot ventilation shaft down to what will be the much larger cavern for the DUNE particle detector and associated infrastructure. There, 4,850 feet below the surface—about 1.5 kilometers deep—the LBNF project will remove hundreds of thousands of tons of rock, roughly the weight of eight aircraft carriers.
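As a quick consistency check on the figures above, only two inputs are added here beyond the article's numbers: the feet-per-kilometre conversion factor, and the assumption that a large aircraft carrier displaces on the order of 100,000 tons.

```python
FEET_PER_KM = 3280.84  # feet in one kilometre

# Depths quoted in the article.
depth_drill_ft = 3650
depth_cavern_ft = 4850
print(depth_cavern_ft / FEET_PER_KM)  # ~1.48, i.e. "about 1.5 kilometers deep"

# 800,000 tons of rock vs "roughly the weight of eight aircraft carriers":
rock_tons = 800_000
carriers = rock_tons / 100_000  # assumed ~100,000-ton displacement per carrier
print(carriers)  # -> 8.0
```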
The emptied space will eventually be filled with DUNE’s enormous and sophisticated detector, a neutrino hunter looking for interactions from one of the universe’s most elusive particles. Researchers will send an intense beam of neutrinos from Fermilab in Illinois to the underground detector in South Dakota—straight through the earth, no tunnel necessary—and measure how the particles change their identities. What they learn may answer one of the biggest questions in physics: Why does matter exist instead of nothing at all?
“The worldwide particle physics community is preparing in various ways for the day DUNE comes online, and this week, we take the material step of excavating rock to support the detector,” says DUNE co-spokesperson Stefan Söldner-Rembold of the University of Manchester. “It’s a wonderful example of collaboration: While excavation takes place in South Dakota, DUNE partners around the globe are designing and building the parts for the DUNE detector.”
A number of science experiments already take data at Sanford Underground Research Facility, but no activity takes place at the 3650 level. With nothing and no one in the vicinity, the initial excavation stage to create the cavern for the drill proceeds in an isolated environment. It’s also an opportunity for the LBNF construction project to gather information about matters such as air flow and the rock’s particular response to the drill-and-blast technique before moving on to the larger excavation at the 4850 level, where the experiment will be built.
“It was important for us to develop a plan that would allow the LBNF excavation to go forward without disrupting the experiments already going on in other parts of the 4850 level,” says Fermilab Long-Baseline Neutrino Facility Far-Site Conventional Facilities Manager Joshua Willhite. Following a period of excavation at the 3650 level, the project will initiate excavation at the 4850 level.
Every bit of the 800,000 tons of rock dislodged by the underground drill-and-blast operation must eventually be transported a mile back up to the surface. There, a conveyor is being built to carry the crushed rock over a stretch of 4,200 feet for final deposit in the Open Cut, an enormous open-pit mining area excavated in the 1980s. As large as the LBNF excavation will be, the rock moved to the surface will fill less than one percent of the Open Cut.
Excavation at the 3650 level will be completed over the next few months, with blasting at the 4850 level planned to begin immediately after.
Editor's note: A version of this article was originally published by Fermilab.
Unraveling the processes that power the sun
2020-06-24T19:28:49Z via NavierStokesApp To: Public
"Unraveling the processes that power the sun"
The Borexino experiment announces the first detection of neutrinos from a secondary cycle that fuels our closest star.
The sun is powered by nuclear fusion, a process through which hydrogen is converted into helium, emitting large amounts of energy.
A side effect of this process is the generation of neutrinos, ghostly fundamental particles that rarely interact with their surroundings. The sun is the source of the majority of the Earth’s neutrinos, sending trillions of these particles raining down on our planet each day.
Solar neutrinos have provided physicists with an invaluable tool to study how the sun works, and what it is made of. Using these elusive particles, scientists have been able to confirm that more than 99 percent of fusion reactions in the sun are proton-proton chain reactions, which use hydrogen as the main source of fuel.
At this week’s Neutrino 2020 meeting, physicists on the Borexino neutrino experiment, located at the Gran Sasso Laboratory in Italy, have announced the first-ever detection of neutrinos from another, less common fusion process: the carbon-nitrogen-oxygen (CNO) cycle, which uses carbon, nitrogen and oxygen as catalysts to fuel the conversion of hydrogen to helium.
“With these results, Borexino marks the first detection ever of CNO solar neutrinos,” Borexino spokesperson Gioacchino Ranucci of INFN Milano said during the Neutrino 2020 presentation. “We have completely unraveled the two processes powering the sun.”
Theorists have long suspected the existence of the CNO cycle. If this process didn’t occur, that would mean that the amount of these three elements in the sun is much lower than expected, says André de Gouvêa, a theoretical physicist and professor at Northwestern University. “We know that they’re supposed to be there unless we’re missing something big.”
In order to detect CNO neutrinos, physicists needed a highly sensitive detector capable of blocking out most sources of background noise.
To achieve this sensitivity, the Borexino detector was built with an onion-like design. It has multiple layers: The transparent, spherical core is filled with 280 tons of liquid scintillator (a material that emits light when a neutrino interacts with the electrons within it), which is encased in a large, stainless-steel sphere filled with a buffer liquid and studded with 2200 light sensors. The outer sphere is held within an even larger stainless-steel tank filled with 2400 tons of ultrapure water. The entire detector is buried 1.4 kilometers (about 0.9 miles) underground.
Borexino has been taking data since 2007. The collaboration made headlines around the world in 2014 when it announced the first real-time detection of neutrinos from proton-proton chain reactions.
Identifying the CNO cycle neutrinos was no simple feat. Even with a highly-sensitive detector, Borexino physicists needed to remove a key source of background: the decay of the isotope Bismuth-210. This was achieved by studying the decay rate of Polonium-210, a process that is in equilibrium with Bismuth-210 decay but easier to measure.
As Ranucci described during the Neutrino 2020 presentation, the quest for CNO neutrinos “turned into the quest for Bismuth-210 through Polonium-210.”
However, since the decay rate of Polonium-210 is highly sensitive to fluctuations in temperature, the team needed to carefully monitor, understand and suppress the temperature variation within the hall that houses the detector. “This is the outcome of the relentless, years-long effort to stabilize the detector and understand the [Polonium-210] behavior in the inner vessel,” Ranucci said.
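The equilibrium the team exploited can be illustrated with a toy decay chain. In the standard picture, both Bismuth-210 (half-life about 5 days) and Polonium-210 (half-life about 138 days) are ultimately fed by a long-lived ancestor (Lead-210, half-life about 22 years), so once transients die away, each daughter's decay rate settles at the common supply rate, and counting the easier-to-tag Polonium-210 decays pins down the Bismuth-210 rate. In the sketch below only the half-lives are physical; the supply rate and times are illustrative.

```python
import math

# A species fed at a constant rate R (decays/day from a long-lived parent)
# and decaying with constant lam obeys dN/dt = R - lam*N, whose decay rate
#   A(t) = lam*N(t) = R + (lam*N0 - R) * exp(-lam*t)
# relaxes toward R after a few half-lives, whatever the half-life is.

LN2 = math.log(2)

def activity(t_days, supply_rate, half_life_days, n0=0.0):
    """Decay rate (decays/day) of a species fed at constant supply_rate."""
    lam = LN2 / half_life_days
    return supply_rate + (lam * n0 - supply_rate) * math.exp(-lam * t_days)

R = 100.0  # illustrative supply rate, decays/day
print(activity(50, R, 5.01))     # Bi-210 after ~10 half-lives: ~R
print(activity(1000, R, 138.4))  # Po-210 after ~7 half-lives: ~R
```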
CNO neutrinos are particularly interesting to solar physicists because they provide one of the most direct measures of the sun’s metallicity: the content of “metals,” which is what astrophysicists call elements other than hydrogen and helium. Studying the properties of CNO neutrinos can help physicists determine whether the sun skews toward heavier metallicity, meaning higher metal content, or lighter metallicity, meaning lower metal content.
Although physicists cannot draw firm conclusions about solar metallicity with the latest Borexino result, the collaboration plans to further constrain their measurements in hopes of providing insights in the future.
Finding out the composition of the sun, and how it works, can help scientists understand how other, similar stars work as well. “There’s a whole range of what the stars can look like, depending on how heavy they are, how old they are, and the mechanism through which they were born,” de Gouvêa says. “By learning as much as we can about the sun, we can take that information and apply it to other stars for which we have more limited knowledge.”
Electricity transmission reaches even higher intensities
2020-06-24T14:29:01Z via NavierStokesApp To: Public
"Electricity transmission reaches even higher intensities"
Intensity is rising at CERN. In the superconducting equipment testing hall, an innovative transmission line has set a new record for the transport of electricity. The link, which is 60 metres long, has transported a total of 54 000 amperes (54 kA, or 27 kA in each direction). “It is the most powerful electrical transmission line built and operated to date!” says Amalia Ballarino, the designer and project leader.
The line has been developed for the High-Luminosity LHC (HL-LHC), the accelerator that will succeed the Large Hadron Collider (LHC) and is scheduled to start up at the end of 2027. Links like this one will connect the HL-LHC’s magnets to the power converters that supply them.
Graphene’s potential to improve magnetic measurements for accelerators
2020-06-23T14:29:48Z via NavierStokesApp To: Public
"Graphene’s potential to improve magnetic measurements for accelerators"
CERN and Paragraf, a technology company born out of the Department of Materials Science at the University of Cambridge, are set to detail the final results of tests conducted on a novel graphene-based local magnetic measurement sensor. The collaboration has shown that such a sensor eliminates some of the systematic errors and inaccuracies found in the state-of-the-art sensors used at CERN.
The Hall probe is an important tool for local magnetic field mapping – an essential task in particle accelerators, which depend on high-precision magnetic fields. The probe transduces the magnetic field into a proportional voltage. However, errors frequently arise because elements of the sensor are not perfectly aligned and are sensitive to in-plane field components (the planar effect), and because of the sensor’s non-linear response.
In theory, graphene solves these problems. This carbon allotrope, first isolated at the University of Manchester in 2004, has been hailed as a new wonder material: its extreme thinness, lightness, conductivity and strength could revolutionize a variety of technologies. In the case of the Hall probe, a two-dimensional graphene sensor eliminates planar effects and makes for precise measurements, including at liquid-helium temperatures.
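The error sources described above can be captured in a toy model. Everything here (the sensitivity, the planar coupling constant, the angles) is a made-up number chosen for illustration; this is a sketch of the planar effect, not Paragraf's or CERN's calibration model.

```python
import math

# Toy Hall probe reading. The ideal transduction is
#   V_H = S * B_perp   (S: sensitivity, B_perp: field normal to the sensor),
# but a misaligned or non-ideal sensor also picks up in-plane field (the
# planar effect), modelled here as a small spurious coupling k_planar.
# A truly two-dimensional sensor corresponds to k_planar -> 0.

def hall_reading(b_tesla, tilt_rad, s=0.1, k_planar=0.002):
    """Probe voltage (V) for field magnitude b_tesla at angle tilt_rad
    between the field and the sensor normal."""
    b_perp = b_tesla * math.cos(tilt_rad)   # component the probe should see
    b_plane = b_tesla * math.sin(tilt_rad)  # in-plane leakage component
    return s * b_perp + k_planar * b_plane

b, tilt = 1.5, math.radians(5)
ideal = 0.1 * b * math.cos(tilt)
error = hall_reading(b, tilt) - ideal
print(error)  # spurious planar contribution, roughly 0.26 mV here
```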
Find out more in the CERN Courier
A CERN-led international collaboration develops 3D-printed neutrino detectors
2020-06-23T13:29:38Z via NavierStokesApp To: Public
"A CERN-led international collaboration develops 3D-printed neutrino detectors"
Plastic scintillators are among the most widely used active materials in high-energy physics. Their properties make it possible to track particles and distinguish between particle topologies. Among other things, scintillators are used in the detectors of neutrino oscillation experiments, where they reconstruct the final state of the neutrino interaction. Measurements of oscillation phenomena are carried out by comparing observations of neutrinos in near detectors (close to the target) and far detectors (up to several hundred kilometres away).
CERN is strongly involved in the T2K experiment, the current world-leading neutrino oscillation experiment, in Japan, which recently released promising results. A future upgrade of the experiment’s near detector will pave the way for more precise results. The novel detector will comprise a two-tonne polystyrene-based plastic scintillator detector segmented into 1 × 1 × 1 cm³ cubes, leading to a total of around two million sensitive elements: the smaller the cubes, the more precise the results. This technology could be adopted for other projects, such as the DUNE near detector. However, more precise measurements would require finer granularity, making the detector assembly harder.
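The "around two million sensitive elements" figure follows directly from the quoted mass and cube size; the only input added below is the density of polystyrene, about 1.05 g/cm³.

```python
# Consistency check on the quoted granularity: a two-tonne polystyrene
# detector cut into 1 x 1 x 1 cm^3 cubes.
mass_g = 2_000_000            # two tonnes in grams
density_g_per_cm3 = 1.05      # typical polystyrene density (added assumption)
cube_volume_cm3 = 1.0

n_cubes = mass_g / density_g_per_cm3 / cube_volume_cm3
print(round(n_cubes / 1e6, 2))  # -> 1.9 (million cubes), i.e. "around two million"
```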
This is where the CERN EP-Neutrino group – led by Albert De Roeck – steps in, developing a new plastic scintillator production technique that involves additive manufacturing. The R&D is carried out in collaboration with the Institute for Scintillation Materials (ISMA) of the National Academy of Science of Ukraine, which has strong expertise in the development of scintillator materials, and the Haute École d’Ingénierie et Gestion du Canton de Vaud (HEIG-VD), which is expert in additive manufacturing. The final goal is to 3D-print a “super-cube”, that is, a single massive block of scintillator containing many optically independent cubes. 3D-printing would solve the issue of assembling the individual cubes, which could thus be produced in any size, including smaller than 1 cm³, and relatively quickly (volumes bigger than 20 × 20 × 20 cm³ can be produced in about a day).
So far, the collaboration has been fruitful. A preliminary test gave the first proof of concept: the scintillation light yield of a polystyrene-based scintillator 3D-printed with fused deposition modelling (see fig. 2) has been found to be comparable to that of a traditional scintillator. But the road towards a ready-to-use super-cube is still long. Further optimisation of the scintillator parameters and tuning of the 3D-printer configuration, followed by a full characterisation of the 3D-printed scintillator, will need to be achieved before the light reflector material for optically isolating the cubes can be developed.
This new technique could also open up new possibilities for the field of particle detection. A successful 3D-printed plastic scintillator detector could pave the way for a broader use of this technology in detector building, which could shake up the field of high-energy physics, as well as that of medicine, where particle detectors are used, for instance, in cancer therapy. Moreover, the highly cost-effective 3D printer could be replicated quite easily and used in a vast number of settings. Umut Kose, from the EP-Neutrino group and Neutrino Platform at CERN, explains: “Our dream goes beyond the super-cube. We like to think that, in a few years, 3D-printing will allow high-school students to make their own radiation detection systems. The outreach potential of this technology is mind-blowing”.
Davide Sgalaberna, now at ETH Zurich, cannot hide his enthusiasm for this adventure: “This is the first time that 3D-printing could be used for real particle detectors. We are transforming our personal will into a project, and we are hopeful that this could lead to a breakthrough. That is thrilling”. A thrill shared by Davide’s colleagues, who are more than ready to resume work on the 3D-printed detector once the easing of lockdown allows everyone to return to CERN.
Read the full story in the EP newsletter
NA64 explores gap in searches for axions and axion-like particles
2020-06-22T14:30:15Z via NavierStokesApp To: Public
"NA64 explores gap in searches for axions and axion-like particles"
There is strong evidence that dark matter exists and permeates the cosmos, yet all searches for the hypothetical particles that may make up this invisible form of matter have drawn a blank so far. In light of these null results, researchers have started to cast a wider net in their searches, exploring as many types of particle as possible, new regions in which the particles may lie hidden and new ways to probe them. The NA64 collaboration has now widened the scope of its searches to include axions and axion-like particles: hypothetical particles that could mediate an interaction between dark matter and visible matter or comprise dark matter itself, depending on their exact properties.
The NA64 team targeted an unexplored area for axions and axion-like particles, a gap in the two-dimensional area of possible values of their mass and interaction strength with a pair of photons. This gap doesn’t include the regions where axions and axion-like particles could make up dark matter, but it includes an area where axions could explain the long-puzzling symmetry properties of the strong force, for which axions were originally proposed, as well as an area where axion-like particles could mediate an interaction between dark matter and visible matter.
To explore this gap, the NA64 team used a 100 GeV electron beam from the Super Proton Synchrotron and directed it onto a fixed target. They then searched for axions and axion-like particles that would be produced in interactions between high-energy photons generated by the 100 GeV electrons in the target and virtual photons from the target’s atomic nuclei. The researchers looked for the particles both through their transformation, or “decay”, into a pair of photons in a detector placed right after the target and through the “missing energy” that the particles would carry away if they decayed downstream of the detector.
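The missing-energy signature can be sketched with a toy event selection: if a produced particle escapes before decaying, the detector records less than the full beam energy. The beam energy is from the text; the threshold and the event records below are invented for illustration.

```python
# Toy missing-energy selection of the kind described above: each incoming
# electron carries 100 GeV; if an axion-like particle decays downstream of
# the detector, the recorded energy falls short of the beam energy.

BEAM_ENERGY = 100.0  # GeV, from the text

def missing_energy(detected_gev):
    return BEAM_ENERGY - detected_gev

def is_candidate(detected_gev, threshold_gev=50.0):
    """Flag events where more than threshold_gev went undetected."""
    return missing_energy(detected_gev) > threshold_gev

events = [99.2, 101.1, 38.0, 97.5, 12.4]   # total detected energy per event
candidates = [e for e in events if is_candidate(e)]
print(candidates)  # -> [38.0, 12.4]
```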
The NA64 team analysed data that was collected over the course of three years, between 2016 and 2018. Together, these data corresponded to some three hundred billion electrons hitting the target. The NA64 researchers found no sign of axions and axion-like particles in this dataset, but the null result allowed them to set limits on the allowed values of the interaction strength of axions and axion-like particles with two photons for particle masses below 55 MeV.
“We’re very excited to have added NA64 to the list of experiments that are hunting for axions as well as axion-like particles, which are a popular candidate for a mediator of a new force between visible and dark matter”, says NA64 collaboration spokesperson Sergei Gninenko. “Little by little, and together, these experiments are narrowing down the regions of where to look for, and perhaps find, these particles.”
European Strategy prioritizes Higgs factory
2020-06-22T13:30:03Z via NavierStokesApp To: Public
"European Strategy prioritizes Higgs factory"
The 2020 European Strategy recommends pursuing a Higgs factory, investigating a next-generation hadron collider at CERN, and ramping up accelerator technology R&D.
The successor to the Large Hadron Collider in Europe should be a machine specifically designed for precision studies of the Higgs boson, according to the CERN Council and two representative bodies commissioned by the Council as part of the 2020 Update of the European Strategy for Particle Physics.
The updated strategy for particle physics in Europe [see PDF] recommends pursuing an electron-positron Higgs factory as the highest-priority facility after the LHC. The LHC is currently undergoing a major upgrade and is scheduled to continue running as the High-Luminosity LHC until late next decade.
The strategy emphasizes the importance of ramping up research and development for advanced accelerator, detector, and computing technologies to prepare for all future collider facilities, including the Future Circular Collider, or FCC-hh. The strategy calls for Europe, together with its international partners, to investigate the technical and financial feasibility of the proposed circular hadron collider with an electron-positron Higgs factory as a possible first stage.
LHC scientists discovered the Higgs boson in 2012 and, among other things, are currently studying its properties to search for signs of physics beyond the Standard Model.
“The Higgs is a very unique particle that raises profound questions about the fundamental laws of nature,” said Halina Abramowicz, the chairperson of the European Strategy update committee, during the open CERN Council session. “That led us to the electron-positron collider as a Higgs factory.”
Hadron colliders such as the LHC are especially prized for the high energies they achieve and their associated potential to discover new particles and forces. Electron-positron colliders, which collide point-like particles that annihilate one another on contact, provide cleaner and easier-to-parse results. An electron-positron collider would allow scientists to study the properties of the Higgs with even greater precision, opening the door to discovering discrepancies between theory and experiment that could reveal new physics.
A possible first stage for the FCC-hh at CERN would be the FCC-ee, an electron-positron collider. The FCC-ee would collide electrons and positrons in a 100-kilometer (62-mile) circular tunnel passing under Lac Leman at the border of France and Switzerland. The FCC-ee tunnel would provide a ready-made home for the FCC-hh.
The updated strategy encourages further investigation of the feasibility of each collider configuration, “such that we are in the position to decide on the project during the next strategy update,” in five to seven years, according to CERN Council President Ursula Bassler.
The strategy highlights the long-term vision of a 100-TeV circular hadron collider similar to the LHC but up to seven times more powerful. This collider would enable scientists to look for extremely heavy particles that could help answer outstanding questions in physics, such as the identity of the mysterious dark matter or the origin of gravity.
“These are very exciting times, very dynamic times, but also very frustrating,” Abramowicz said in a Q&A after the announcement. “Many people are really scratching their heads.”
Both the electron-positron Higgs factory and the 100-TeV collider would be global endeavors that will incorporate knowledge and contributions from institutions around the world. “This is a strategy for Europe but within a global context,” CERN Director-General Fabiola Gianotti said. “I’m excited to be part of this adventure implementing the strategy and working with our colleagues from all over the world.”
Based on the strong endorsement of the need for a Higgs factory, the strategy also encourages the timely realization of the electron-positron International Linear Collider in Japan, which would be compatible with Europe’s own plans for a future facility at CERN. In such a case, the strategy expresses the willingness of the European particle physics community to collaborate on the ILC.
In addition to highlighting R&D on future colliders, the strategy reaffirms the importance of the ongoing work for the High-Luminosity LHC accelerator and detector upgrades and continued commitment to participate in long-baseline neutrino research in the United States and Japan. The strategy suggests, “In particular, [Europe and CERN] should continue to collaborate with the United States and other international partners towards the successful implementation of the Long-Baseline Neutrino Facility (LBNF) and the Deep Underground Neutrino Experiment (DUNE).”
LBNF/DUNE is hosted by the US Department of Energy’s Fermi National Accelerator Laboratory in Illinois, with CERN and several European nations as major partners.
DOE’s Office of Science has partnered with CERN on R&D efforts to develop next-generation accelerators for the past two decades. The updated European Strategy is consistent with DOE’s own program planning.
The strategy is also designed to be compatible with program planning for other fields, offering support for complementary projects in other related disciplines.
“In science, there is a tendency that if you want to make progress and reach higher precision, the equipment grows,” Bassler said. “As the equipment grows, we come together to support each other.”