As cosmologists ponder the universe—and other possible universes—the data available to them is so complex and vast that it can be extremely challenging for humans alone to comprehend.
By applying scientific principles originally used to build models in cell biology and physics to the challenges of cosmology and big data, Cornell researchers have developed a promising algorithm for mapping a multifaceted set of probabilities.
The new method, which researchers have used to visualize models of the universe, could help solve some of physics’ greatest mysteries, such as the nature of dark energy or the likely characteristics of other universes.
“Science works because things behave much more simply than they have any right to,” said James Sethna, professor of physics and senior author of “Visualizing Probabilistic Models With Intensive Principal Component Analysis,” which published online June 24 in the Proceedings of the National Academy of Sciences. “Very complicated things end up doing rather simple collective behavior.”
That, he said, is because not every factor in a system is significant. For example, millions of atoms may be involved in a physical collision, but their behavior is determined by a relatively small number of constants. Data about the universe collected by powerful telescopes, however, has so many parameters that it can be difficult for researchers to figure out which measurements matter most for revealing insights.
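The idea that high-dimensional data often hides low-dimensional structure can be illustrated with ordinary principal component analysis (not the paper's intensive PCA, which generalizes this to probability distributions). The sketch below uses made-up data: 50 measured parameters generated from only two underlying directions plus noise, so nearly all the variation concentrates in the first two components.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: 1,000 observations of 50 parameters, but
# generated from only 2 underlying "important" directions plus small
# noise -- the kind of hidden simplicity the article describes.
n_samples, n_params, n_hidden = 1000, 50, 2
hidden = rng.normal(size=(n_samples, n_hidden))
mixing = rng.normal(size=(n_hidden, n_params))
data = hidden @ mixing + 0.05 * rng.normal(size=(n_samples, n_params))

# Ordinary PCA via SVD of the centered data: each singular value
# measures how much variation its direction carries.
centered = data - data.mean(axis=0)
singular_values = np.linalg.svd(centered, compute_uv=False)
explained = singular_values**2 / np.sum(singular_values**2)

# Almost all of the variance sits in the first two components.
print(np.round(explained[:4], 3))
```

In real cosmological data the "important directions" are not known in advance; methods like the authors' intensive PCA aim to find analogous low-dimensional structure in spaces of probabilistic models rather than in raw measurements.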