Thomas Huffman’s Up Golf is out now, and it’s a fun game worth downloading. This is a 2D vertical golfing game in the vein of Super Stickman Golf, except here you’re constantly trying to ascend, earning a point for each hole you get your character-turned-golf-ball into. You can skip holes, but I don’t recommend it.
A major part of the challenge comes from the screen that constantly scrolls upward, so one ill-considered shot can send you into the abyss and end your run of golfing. You’ll also encounter water and lava traps, often with side entrances, so take particular care around those. The courses do contain some interesting wrinkles: portals that warp you upward, and springs that can help you if you hit them just right.
You can unlock new characters and landscapes by collecting coins, or you can just buy them outright. The landscapes span entirely different themes, including a vaporwave one. Finally, golf meets vaporwave. Beautiful.
If you like 2D golfing games, you’ll likely have a good time with Up Golf. The controls are smooth, with the ability to pull and aim from anywhere on the screen, so you’re never blocking your view of your character. The game also looks great on high-resolution phones. Reaching even a double-digit score is incredibly challenging, because there just isn’t much margin for error. You have to aim well and be wary of bad bounces, and there’s the timing factor to consider with the screen continually rising! You have to get good at making accurate shots, but also quick ones, and that makes for a tense challenge the higher you ascend!
Give Up Golf a shot; it’s a well-made endless 2D golfer that I’m having a lot of fun playing. It’s out now on the App Store and on Google Play…you know, if you’ve gone over to the dark side.
There was a time in the App Store when you couldn’t avoid seeing a physics puzzler at every turn, which is fine and all, but once a genre gets oversaturated like that it’s hard to get excited for each new release. Well, like many of the most overdone genres on mobile, there comes a point when enough time has passed and you’re actually in the mood for one of those games again. That’s how I felt when I saw the trailer for Polar Rollout from developer Blue Evolution. Your job here is to draw a path on the touchscreen that your character can roll along, destroying any enemies in its path and making it to the goal. Here’s a trailer.
Polar Rollout will include 120 levels across 6 different worlds and will come with multiple playable characters to unlock and a host of unique special abilities. It looks cute and well-designed, and like I said before, I’m actually in the mood for a new physics puzzler at this point in my mobile gaming life. If you feel the same, you can pre-order Polar Rollout ahead of its planned April 18th release date. Just click this link on an iOS 11 device to be magically whisked away to the App Store, where you can pre-order; the game will then automatically download to your device when it arrives. For more, you can drop by the forum thread, which has been chugging along for more than two years as the game has been in development.
As far as astronomers can tell, the universe is continuing to expand — and our understanding of how it is doing this needs to expand as well. In fact, recent findings from researchers partnering with NASA suggest that we may need to discover new physics to explain discrepancies between measurements of universal expansion.
“The community is really grappling with understanding the meaning of this discrepancy,” Nobel Laureate Adam Riess, of the Space Telescope Science Institute (STScI) and Johns Hopkins University, told NASA. Riess was one of the researchers who studied Cepheid stars and Type Ia supernovae to measure the universe’s expansion, finding it to be faster than previously thought.
Precision is Key
Riess and colleagues, including Stefano Casertano, also of STScI and Johns Hopkins, have been giving these stars another look, using NASA’s Hubble to gather data on Cepheids some 6,000 to 12,000 light-years away. What they did was measure how the positions of these stars changed with each of Earth’s orbits around the Sun.
“You’re measuring the separation between two stars, not just in one place on the camera, but over and over thousands of times, reducing the errors in measurement,” Riess said in the NASA press release.
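For a sense of scale (these are my own back-of-the-envelope numbers, not figures from the study): a star's distance in parsecs is the reciprocal of its parallax angle in arcseconds, so Cepheids 6,000 to 12,000 light-years away shift by only fractions of a milliarcsecond as Earth orbits the Sun, which is why so many repeated measurements are needed.

```python
# Rough illustration of why Cepheid parallax demands extreme precision.
# Assumes the standard relation: distance (parsecs) = 1 / parallax (arcsec).

LY_PER_PARSEC = 3.2616  # light-years in one parsec

def parallax_mas(distance_ly):
    """Parallax angle in milliarcseconds for a star at the given distance."""
    distance_pc = distance_ly / LY_PER_PARSEC
    return 1000.0 / distance_pc  # arcseconds -> milliarcseconds

for d in (6_000, 12_000):
    print(f"{d:>6} ly -> parallax ~ {parallax_mas(d):.2f} mas")
```

Even at the near end of that range, the apparent shift is about half a milliarcsecond, far below what a single exposure can pin down.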
Their latest findings, which have been accepted for publication in The Astrophysical Journal, suggest a different value for the Hubble Constant, one that’s supposedly more precise because of the instruments they have been using. They’re not stopping there, though.
The team plans to improve their measurements using data from Hubble and the European Space Agency’s Gaia space observatory. “This precision is what it will take to diagnose the cause of this discrepancy,” Casertano said in the NASA press release.
Riess and his team, however, are still uncertain why the discrepancies exist in the first place, and the answer could show that something is going on in the universe that may even require new physics to understand.
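The article doesn't quote numbers, but the widely reported values around the time of this work were roughly 73 km/s per megaparsec from the local Cepheid and supernova ladder versus about 67 from Planck's cosmic microwave background data. A quick sketch of what that gap means under Hubble's law (v = H0 × d), with those approximate figures as my own inputs:

```python
# Illustrative only: the values below are the widely reported approximate
# figures (local distance-ladder vs. Planck CMB), not numbers from the study.
H0_LOCAL = 73.0  # km/s per megaparsec, local measurement (approx.)
H0_CMB = 67.0    # km/s per megaparsec, Planck-inferred (approx.)

def recession_velocity(distance_mpc, h0):
    """Hubble's law: v = H0 * d."""
    return h0 * distance_mpc

d = 100  # a galaxy 100 megaparsecs away
v_local = recession_velocity(d, H0_LOCAL)
v_cmb = recession_velocity(d, H0_CMB)
print(f"At {d} Mpc: {v_local:.0f} vs {v_cmb:.0f} km/s, "
      f"about {100 * (v_local - v_cmb) / v_cmb:.0f}% apart")
```

A roughly nine percent disagreement in a quantity both sides measure with percent-level error bars is exactly the kind of tension that makes physicists suspect something is missing.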
The many worlds interpretation of quantum mechanics suggests that each one of the multiple possible outcomes of a quantum event splinters off into its own discrete world.
Now, a team of researchers at the California Institute of Technology led by Sean Carroll has suggested that this interpretation can explain away inconsistencies pertaining to black holes. They say that general relativity is upheld within each single possible world, while information is preserved across the entire global wave function, if not among individual branches.
Aidan Chatwin-Davies, a member of Carroll’s team, told Futurism that other scientists have already suggested applying the many worlds theory, also known as the Everettian interpretation, to the black hole information problem. “Cosmetically, we’re perhaps the first to cleanly label our perspective as Everettian,” he said. “More substantively, we wanted to perform some concrete calculations to mathematize otherwise abstract ideas.”
“Previous attempts considered statements of general relativity and quantum mechanics to be applicable to the same world,” Yasunori Nomura, professor of physics at the University of California, Berkeley told Futurism. “My approach separates the two – quantum mechanics allows for a quantum state to be a ‘superposition’ of many classical worlds; statements of quantum mechanics apply to the entirety of these many worlds while those of general relativity apply only to each of these worlds.”
This line of thinking is important because it could potentially explain more about the nature of gravity and spacetime. Nomura suggests that these ideas have a broader relevance to how quantum gravity works at a fundamental level, particularly in relation to the origins of the universe.
“We know that we need both general relativity and quantum mechanics to understand black holes, and so they are a good starting point for testing out ideas about quantum gravity,” explained Chatwin-Davies. “If we really understood how to describe black holes, then we would be a great deal closer to being able to describe quantum gravity in broad generality.”
By using the many worlds interpretation, scientists and astronomers are finding new ways to approach longstanding questions about black holes. With further study, this research might offer up further information about the very fabric of our universe that could fill in some persistent gaps in our knowledge.
Scientists have long been intrigued by the physics near absolute zero — 0 kelvin, or -273.15°C, the temperature at which particles reach the lowest possible amount of movement — ever since this limit was theorized. Yet reaching absolute zero has been called impossible: as you continue to remove heat from a gas to cool it, the work needed to remove the remaining heat increases, and at absolute zero the work to remove additional heat becomes infinite.
This doesn’t discourage scientists from trying, however. A group at the University of Basel recently developed a device that gets us closer than ever before to the coldest of cold, and it could allow them to explore the strange physics thought to occur near absolute zero.
The team developed a nanoelectronic chip that can cool to a record 2.8 millikelvin. The device uses magnetic cooling, via an applied magnetic field, to chill the chip’s electrical connections down to 150 microkelvin. Using a specially designed magnetic cooling system, the team also cooled a thermometer to measure the temperature, and kept the chip cold for seven hours. This extended period allowed the team to explore the ultra-cooled state.
While the chip itself is impressive, it’s not even the most exciting part of this experiment. The chip opens up enormous potential to better understand what happens to physics near absolute zero. That understanding could keep growing, as the researchers hope to improve the device and their experiment to eventually reach a bone-chilling 1 millikelvin.
Within the past year, the observable limit (also known as the quantum backaction limit) of how low you can theoretically cool an object has been experimentally challenged. Researchers were able to cool an object to less than one-fifth of one quantum, which is below the “quantum limit.” But later on, scientists officially proved the third law of thermodynamics by showing that it is mathematically impossible to cool to absolute zero.
The strange physical space on the cusp of reaching absolute zero is relatively unexplored, as it has been so difficult to reach experimentally. With our newfound ability to research this uber-cold state, there is still much to learn about physics at these temperatures.
Additionally, understanding absolute zero could potentially improve modern electronics. The performance of transistors is greatly affected by temperature: traditional transistors run into a whole host of issues when they overheat, which happens commonly. The transistors used in computers, smart devices, and other commercially available electronics have been shown to be much more efficient at extremely low temperatures, like those barely above absolute zero.
The use of extremely low temperatures could improve not only household electronics but also the technologies we use to explore the far reaches of the cosmos. Infrared cameras built for space imaging need to operate at the lowest possible temperatures, which allows them to run at maximum sensitivity. These temperatures could also be integral in advancing medical imaging technologies.
Recent measurements from the Large Hadron Collider show a discrepancy with Standard Model predictions that may hint at entirely new realms of the universe underlying what’s described by quantum physics. Although repeated tests are required to confirm these anomalies, a confirmation would signify a turning point in our most fundamental description of particle physics to date.
Quantum physicists found in a recent study that mesons don’t decay into kaon and muon particles as often as the Standard Model predicts. The authors suggest that enhancing the power of the Large Hadron Collider (LHC) could reveal a new kind of particle responsible for this discrepancy. Even if errors in data or theory, rather than a new particle, turn out to have caused the discrepancy, an improved LHC would prove a boon for several projects on the cutting edge of physics.
The Standard Model
The Standard Model is a well-established fundamental theory of quantum physics that describes three of the four fundamental forces believed to govern our physical reality. Quantum particles occur in two basic types, quarks and leptons. Quarks bind together in different combinations to build particles like protons and neutrons. We’re familiar with protons, neutrons, and electrons because they’re the building blocks of atoms.
The “lepton family” features heavier versions of the electron — like the muon — and the quarks can coalesce into hundreds of other composite particles. Two of these, the Bottom and Kaon mesons, were the culprits in this quantum mystery. The Bottom meson (B) decays to a Kaon meson (K) accompanied by a muon (mu-) and an anti-muon (mu+).
They found a 2.5 sigma variance, or 1 in 80 probability, “which means that, in the absence of unexpected effects, i.e. new physics, a distribution more deviant than observed would be produced about 1.25 percent of the time,” Professor Spencer Klein, senior scientist at Lawrence Berkeley National Laboratory, told Futurism. Klein was not involved in the study.
This means the frequency of mesons decaying into strange quarks during the LHC proton-collision tests fell a little below the expected frequency. “The tension here is that, with a 2.5 sigma [or standard deviation from the normal decay rate], either the data is off by a little bit, the theory is off by a little bit, or it’s a hint of something beyond the standard model,” Klein said. “I would say, naïvely, one of the first two is correct.”
To Klein, this variance is inevitable considering the high volume of data run by computers for LHC operations. “With petabyte (10^15 bytes) sized datasets from the LHC, and with modern computers, we can make a very large number of measurements of different quantities,” Klein said. “The LHC has produced many hundreds of results. Statistically, some of them are expected to show 2.5 sigma fluctuations.” Klein noted that particle physicists usually wait for a 5-sigma fluctuation before crying wolf — corresponding to roughly a 1-in-3.5-million fluctuation in data.
These latest anomalous observations do not exist in a vacuum. “The interesting aspect of the two taken in combination is how aligned they are with other anomalous measurements of processes involving B mesons that had been made in previous years,” Dr. Tevong You, co-author of the study and junior research fellow in theoretical physics at Gonville and Caius College, University of Cambridge, told Futurism. “These independent measurements were less clean but more significant. Altogether, the chance of measuring these different things and having them all deviate from the Standard Model in a consistent way is closer to 1 in 16000 probability, or 4 sigma,” Tevong said.
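The sigma-to-probability figures quoted above can be sanity-checked with the Gaussian tail integral. Here is a sketch (my own, using the usual conventions: two-sided tails for the 2.5- and 4-sigma figures, a one-sided tail for the 5-sigma discovery threshold):

```python
import math

def one_sided_p(sigma):
    """Probability of a Gaussian fluctuation beyond +sigma (one tail)."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

def two_sided_p(sigma):
    """Probability of a fluctuation beyond +/-sigma (both tails)."""
    return math.erfc(sigma / math.sqrt(2))

print(f"2.5 sigma: 1 in {1 / two_sided_p(2.5):.0f}")    # roughly 1 in 80
print(f"4.0 sigma: 1 in {1 / two_sided_p(4.0):,.0f}")   # roughly 1 in 16,000
print(f"5.0 sigma: 1 in {1 / one_sided_p(5.0):,.0f}")   # roughly 1 in 3.5 million
```

The outputs line up with the 1-in-80, 1-in-16,000, and 1-in-3.5-million figures quoted by Klein and You, which is a good check that those probabilities are the standard Gaussian conversions rather than bespoke numbers.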
Extending the Standard Model
Barring statistical or theoretical errors, Tevong suspects that the anomalies mask the presence of entirely new particles, called leptoquarks or Z prime particles. Inside bottom mesons, quantum excitations of new particles could be interfering with normal decay frequency. In the study, researchers conclude that an upgraded LHC could confirm the existence of new particles, making a major update to the Standard Model in the process.
“It would be revolutionary for our fundamental understanding of the universe,” said Tevong. “For particle physics […] it would mean that we are peeling back another layer of Nature and continuing on a journey of discovering the most elementary building blocks. This would have implications for cosmology, since it relies on our fundamental theories for understanding the early universe,” he added. “The interplay between cosmology and particle physics has been very fruitful in the past. As for dark matter, if it emerges from the same new physics sector in which the Zprime or leptoquark is embedded, then we may also find signs of it when we explore this new sector.”
The Power to Know
So far, scientists at the LHC have only observed ghosts and anomalies hinting at particles that exist at higher energy levels. To prove their existence, physicists “need to confirm the indirect signs […], and that means being patient while the LHCb experiment gathers more data on B decays to make a more precise measurement,” Tevong said. “We will also get an independent confirmation by another experiment, Belle II, that should be coming online in the next few years. After all that, if the measurement of B decays still disagrees with the predictions of the Standard Model, then we can be confident that something beyond the Standard Model must be responsible, and that would point towards leptoquarks or Zprime particles as the explanation,” he added.
To establish their existence, physicists would then aim to produce the particles in colliders the same way Bottom mesons or Higgs bosons are produced, and watch them decay. “We need to be able to see a leptoquark or Zprime pop out of LHC collisions,” Tevong said. “The fact that we haven’t seen any such exotic particles at the LHC (so far) means that they may be too heavy, and more energy will be required to produce them. That is what we estimated in our paper: the feasibility of directly discovering leptoquarks or Zprime particles at future colliders with higher energy.”
Quantum Leap for the LHC
Seeking out new particles in the LHC isn’t a waiting game. The likelihood of observing new phenomena is directly proportional to how many new particles pop up in collisions. “The more the particle appears the higher the chances of spotting it amongst many other background events taking place during those collisions,” Tevong explained. For the purposes of finding new particles, he likens it to searching for a needle in a haystack; it’s easier to find a needle if the haystack is filled with them, as opposed to one. “The rate of production depends on the particle’s mass and couplings: heavier particles require more energy to produce,” he said.
This is why Tevong and co-authors B.C. Allanach and Ben Gripaios recommend either extending the LHC loop’s length, thus reducing the amount of magnetic power needed to accelerate particles, or replacing the current magnets with stronger ones.
According to Tevong, the CERN laboratory is slated to keep running the LHC in its present configuration until the mid-2030s. Afterwards, they might upgrade the LHC’s magnets, roughly doubling their strength. In addition to souped-up magnets, the tunnel could see an enlargement from the present 27 km to 100 km (17 to 62 miles). “The combined effect […] would give about seven times more energy than the LHC,” Tevong said. “The timescale for completion would be at least in the 2040s, though it is still too early to make any meaningful projections.”
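That "seven times" estimate is consistent with the rule of thumb that a circular collider's beam energy scales with the product of the bending magnets' field strength and the ring's size. The assumption and arithmetic below are mine; the paper's detailed figures may differ:

```python
# Back-of-the-envelope check of the "about seven times more energy" figure.
# Assumes beam energy scales linearly with magnet field strength and with
# ring circumference (E ~ B * R for a circular collider); illustrative only.

LHC_CIRCUMFERENCE_KM = 27.0
UPGRADED_CIRCUMFERENCE_KM = 100.0
FIELD_FACTOR = 2.0  # magnets roughly doubled in strength

energy_factor = FIELD_FACTOR * (UPGRADED_CIRCUMFERENCE_KM / LHC_CIRCUMFERENCE_KM)
print(f"Estimated energy gain: ~{energy_factor:.1f}x the LHC")  # ~7.4x
```

Doubling the field and nearly quadrupling the ring each contribute multiplicatively, which is why the combined gain lands near seven rather than the sum of the two improvements.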
If the leptoquark or Z prime anomalies are confirmed, the Standard Model has to change, Tevong reiterates. “It is very likely that it has to change at energy scales directly accessible to the next generation of colliders, which would guarantee us answers,” he added. While noting that there’s no telling if dark matter has anything to do with the physics behind Zprimes or leptoquarks, the best we can do is seek “as many anomalous measurements as possible, whether at colliders, smaller particle physics experiments, dark matter searches, or cosmological and astrophysical observations,” he said. “Then the dream is that we may be able to form connections between various anomalies that can be linked by a single, elegant theory.”