NSF awards $25 million to UC, 16 other institutions to develop particle physics software
October 11, 2019
The National Science Foundation on Tuesday awarded $25 million to the University of Cincinnati and 16 other research institutions to study high-energy physics.
The goal is to develop the computer tools needed to capture and parse huge bundles of data from particle collisions in the Large Hadron Collider in the Swiss lab CERN, which first observed the Higgs boson particle in 2012.
“We want algorithms that select the most interesting events with high efficiency and reject the rest,” UC physics professor Michael Sokoloff said.
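The idea behind such event-selection algorithms can be illustrated with a toy sketch. Everything here is invented for illustration: the energy field, the threshold, and the exponential energy distribution are assumptions, not CERN's actual trigger criteria.

```python
import random

def toy_trigger(event, energy_threshold=50.0):
    """Toy 'trigger': keep an event only if its energy exceeds a threshold."""
    return event["energy"] > energy_threshold

# Simulate 1,000 collision events with random energies (arbitrary units).
random.seed(42)
events = [{"energy": random.expovariate(1 / 20.0)} for _ in range(1000)]

kept = [e for e in events if toy_trigger(e)]
print(f"Kept {len(kept)} of {len(events)} events "
      f"({100 * len(kept) / len(events):.1f}% pass the trigger)")
```

A real trigger applies many such criteria at once, in hardware and software, fast enough to keep up with the collision rate; the point of the sketch is only that most events are rejected up front so that storage is spent on the interesting few.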
Sokoloff has spent his career studying particle physics at accelerators such as Fermilab near Chicago and the SLAC National Accelerator Laboratory in California. He is the principal investigator for UC. Princeton University is leading the project.
By 2026, CERN plans to increase tenfold the number of particle collisions it can produce in its collider — a concept called luminosity. But to capture so much data — measured in exabytes, or a million terabytes — scientists will need more powerful computing tools. To that end, the National Science Foundation is opening a new institute for research and innovation in software for high-energy physics.
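A back-of-the-envelope calculation gives a feel for the scale. The per-event size below is an assumed round number for illustration, not an official CERN figure; the collision rate is the roughly 1 billion per second cited for the upgraded collider.

```python
collisions_per_second = 1_000_000_000  # ~1 billion collisions per second
bytes_per_event = 1_000_000            # assume ~1 MB of raw detector data per event
seconds_per_year = 3600 * 24 * 365

raw_bytes_per_year = collisions_per_second * bytes_per_event * seconds_per_year
exabytes = raw_bytes_per_year / 1e18
print(f"Raw data rate: ~{exabytes:,.0f} exabytes per year before any filtering")
```

Under these assumptions the raw stream runs to tens of thousands of exabytes a year, which is why filtering events in real time, rather than storing everything, is the only workable approach.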
“Even now, physicists just can't store everything that the (collider) produces,” said Bogdan Mihaila, the NSF program officer overseeing the grant. “We have to get smarter and step up our game. That is what the new software institute is about.”
The new institute represents a coalition of 17 research universities that will develop computer solutions to study data from the Large Hadron Collider’s four detectors. By 2026, the collider will generate as many as 1 billion proton collisions every second. New computer tools will help physicists capture the collisions (called events), identify the most relevant ones and study the results efficiently.
Efficiency is relative in particle physics. Data captured in previous experiments has taken months to sort and analyze, Sokoloff said.
“We want to use new computing architectures more effectively. And we’d like to develop qualitatively new algorithms,” Sokoloff said.
Among other questions, Sokoloff and other physicists at CERN are searching for an explanation for why the universe is full of matter but almost devoid of antimatter, even though the Big Bang should have created the two in equal parts.
"High-energy physics had a rush of discoveries and advancements in the 1960s and 1970s that led to the Standard Model of particle physics, and the Higgs boson was the last missing piece of that puzzle," Princeton University physicist Peter Elmer said. "We are now searching for the next layer of physics beyond the Standard Model.”
Sokoloff, a professor in UC’s McMicken College of Arts and Sciences, is collaborating with Gowtham Atluri, an assistant professor of computer science in UC’s College of Engineering and Applied Science. Atluri’s research is in the area of data science with a focus on developing computational algorithms that will accelerate the pace of scientific discovery.
Atluri said computer scientists will develop new ways to help physicists discover rare and previously unknown events, the proverbial needle in the haystack. Any breakthroughs are likely to help scientists in diverse fields such as neuroscience, where brain scans also generate huge volumes of data, Atluri said.
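A minimal illustration of the "needle in a haystack" idea is a simple z-score outlier test: flag any measurement that sits far from the bulk of the data. The numbers and cutoff below are invented, and real anomaly-detection methods for physics data are far more sophisticated.

```python
import statistics

def find_outliers(values, z_cutoff=2.0):
    """Flag values more than z_cutoff standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > z_cutoff]

# 'Background' measurements clustered near 100, plus one rare anomaly.
measurements = [100.2, 99.8, 100.5, 99.9, 100.1, 100.3, 99.7, 250.0]
print(find_outliers(measurements))  # the anomalous 250.0 stands out
```

The same basic strategy, separating a rare signal from an overwhelming background, is what makes techniques developed for collider data transferable to fields like neuroscience.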
The NSF will award the collaborators $5 million per year for five years to develop new software and train the next generation of users. UC will pursue innovative algorithms for reconstructing data in real time and selecting the subset to be recorded for offline study. UC researchers also will develop the tools needed to analyze that data.
The project is just the latest NSF contribution to the 40-nation effort at CERN, which has spawned contributions to society such as the World Wide Web in 1989. Similarly, advancements from the project will be shared publicly through its open-source code.
Ambitious projects that transcend conventional academic boundaries can inspire a new generation of students to embrace interdisciplinary research, UC’s Atluri said.