IoT by name or nature? Delivering experience over appearance

The last few years have seen a whole raft of IoT vanity projects, where connectivity for connectivity’s sake was the order of the day: everything from connected loo-roll holders that warned when paper levels were low (if only there were a pre-existing, simpler way to check) to flip-flops with IoT capability crammed in and marketed as ‘smart shoes’.

With the practical use of these products being next to zero, many consumers have been driven to despair. From a business perspective, too, IoT by name rather than nature can be damaging. There is a worrying potential for organisations to divert millions of dollars into IoT projects without a clear handle on their objectives and, possibly worse, without a thorough testing plan to ensure the product delivers as intended.

Gartner estimates that by 2020 there will be seven billion connected business devices in use. In this digital transformation boom, companies are investing vast sums in IoT capabilities, and the B2B IoT market is growing fast. But the question remains: how much of this growth actually benefits customers? And how can organisations ensure that, when they embark on an IoT project, it is useful and consistently delivers the value it should to its intended audience?

Delivering real value

Instead of businesses trying to nail down their own version of what an IoT ‘vision’ should look like, everyone could be better served by looking at those doing it successfully and using this intelligence to optimise IoT offerings from inception to delivery. Like any other mission-critical area, IoT needs a strategy and a vision well before a project begins.

Companies like Volvo Car Group seem to be doing things right. Receiving recognition at a recent awards ceremony for Volvo’s work on connected cars and cloud technology, SVP & CIO Klas Bendrik said: “We take the best available technology and make it work in the most useful way for our customers. It’s about using technology to provide tangible real-life benefits, rather than providing technology just for the sake of it.”

This is exactly the point. Other companies would do well to emulate approaches that deliver clear value (in this case, more efficient and safer cars that improve people’s journeys). If they can deliver initiatives with real benefit for customers, that success will make the IoT ever more popular, which in turn makes it more valuable and relevant to day-to-day life and business. Performance and availability of connected devices will therefore become key differentiators in an ever more competitive and crowded marketplace.

Test, test, and test again

The lesson here is that companies investing in IoT have to put the time into doing it right, and in most instances, this means proper monitoring and testing in order to guarantee continuous performance that will actually add true business value. If the idea was to create a digital app to delight customers, then it’s vital to ensure the app delivers against this vision. In short, it needs to work and stand the test of time and popularity. 

We know that connected IoT devices depend heavily on the speed of communication, which exposes them to issues such as unreliable network hardware or slow internet connections. Testing IoT devices to make sure they are not losing data, are not failing to respond, and work in any scenario is imperative.
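As a minimal sketch of what such a test might look like (the device endpoint, the `seq` payload field, and the thresholds below are all hypothetical, not any particular vendor’s tooling), a simple probe can repeatedly poll a device, flag slow or missing responses, and check that readings arrive intact and in order:

```python
import json
import time
import urllib.request

DEVICE_URL = "http://192.168.1.50/telemetry"  # hypothetical device endpoint
TIMEOUT_S = 2.0   # any response slower than this counts as a failure to respond
POLLS = 100

failures, last_seq = 0, None
for i in range(POLLS):
    start = time.monotonic()
    try:
        with urllib.request.urlopen(DEVICE_URL, timeout=TIMEOUT_S) as resp:
            payload = json.loads(resp.read())
            # Detect data loss: assume the device stamps each reading with an
            # incrementing sequence number, so any gap means a dropped message.
            if last_seq is not None and payload["seq"] != last_seq + 1:
                print(f"poll {i}: data loss, jumped {last_seq} -> {payload['seq']}")
            last_seq = payload["seq"]
    except Exception as exc:
        failures += 1
        print(f"poll {i}: no valid response after {time.monotonic() - start:.2f}s ({exc})")
    time.sleep(0.5)

print(f"{failures}/{POLLS} polls failed")
```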

Key to the customer experience is proactively monitoring your websites and applications, not to mention APIs – and doing it 24/7 rather than intermittently. That way, before your valuable customers run into a wall and start making a lot of noise about availability or performance issues, you can already be fixing the problem. Speed is crucial; performance indicators like page load times are directly linked to a loss of views and visitors – the longer you test people’s patience, the more risk you run of losing their trade.
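A proactive monitor can be as simple as a scheduled probe that measures availability and response time and raises an alert before customers complain. A minimal sketch, assuming a hypothetical alert_ops() hook and placeholder URLs:

```python
import time
import urllib.request

ENDPOINTS = [
    "https://example.com/",            # customer-facing page (placeholder)
    "https://api.example.com/health",  # API health check (placeholder)
]
SLOW_THRESHOLD_S = 3.0  # page-load budget before users start leaving

def alert_ops(message: str) -> None:
    # Hypothetical hook: wire this to email, Slack, PagerDuty, etc.
    print(f"ALERT: {message}")

while True:  # in production this loop would be driven by a scheduler, 24/7
    for url in ENDPOINTS:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                elapsed = time.monotonic() - start
                if resp.status != 200:
                    alert_ops(f"{url} returned HTTP {resp.status}")
                elif elapsed > SLOW_THRESHOLD_S:
                    alert_ops(f"{url} took {elapsed:.1f}s (budget {SLOW_THRESHOLD_S}s)")
        except Exception as exc:
            alert_ops(f"{url} unreachable: {exc}")
    time.sleep(60)  # probe every minute rather than intermittently
```

In practice this would run from a dedicated monitoring service rather than a bare loop, but the principle is the same: measure constantly and alert before the customer notices.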

There are more issues to consider, cyber-crime and data privacy not least amongst them. The downside of the IoT can be a dangerous one – and embarrassing. After all, who wants to get hacked by a kettle? Testing needs to push applications on all areas of performance, including how secure they are for end users.

Making sure we get the most out of IoT projects shouldn’t rely on an ad hoc process that concerns only a few technical individuals in any given organisation. Not so long ago, Business Insider predicted that the Internet of Things would be the largest device market in the world by 2019. In a year or so, it will be more than double the size of the smartphone, PC, tablet, connected car, and wearable markets combined. By then, let’s hope all those devices are things we need (and love!) and that they work with 24/7 reliability. Proper testing enables organisations to take the first step on this journey and deliver a leading customer experience.

In Travis Kalanick’s first public appearance since resigning from Uber, his competitive nature was put on trial

Kalanick took the witness stand in Alphabet’s lawsuit against Uber.

The legal saga between tech behemoths Alphabet and Uber began with a bruising condemnation of Uber’s former CEO.

“This case is about one competitor deciding they need to win at all costs,” an attorney for Alphabet said in his opening statement on Monday.

“Mr Kalanick, the CEO at the time at Uber, made a decision that winning was more important than obeying the law,” he continued.

On Tuesday, the second day of trial in the U.S. District Court for the Northern District of California in San Francisco, an uncharacteristically soft-spoken Kalanick took the stand to face these accusations head on.

It’s the first time the notoriously combative former CEO of Uber has spoken publicly since he was ousted by major shareholders in June 2017.

While Kalanick, who donned a dark suit and tie, answered questions about his aggressive ambitions to win the self-driving car race, the normally emotive Uber co-founder appeared restrained even when asked to concede that Google was in the self-driving lead.

The attorney asked if he agreed that Google is the industry leader for autonomous vehicles.

“I think that’s the general perception right now,” he answered.

Kalanick’s testimony will likely be a central part of Alphabet’s argument. Alphabet is alleging Uber worked with former engineer Anthony Levandowski to steal self-driving trade secrets before he left Waymo, Alphabet’s self-driving arm, and created a startup that he would eventually sell to Uber.

The company has to prove that Uber, not Levandowski — who is not a party to the case and has pleaded the Fifth Amendment — misappropriated Waymo’s self-driving trade secrets. That is why shifting the focus to Kalanick and what Waymo has characterized as his rapacious nature is important for Waymo’s case.

If Uber loses the case, it could have to pay out millions of dollars in damages and see its self-driving efforts stalled. For Waymo, the risks of losing are largely reputational. Alphabet rarely, if ever, sues people or other companies, which means this litigation carries a lot of weight.

Waymo’s strategy so far appears to be attempting to prove that Kalanick was fixated on beating Google and winning the self-driving car race at all costs, which could in turn serve to explain his motivation to conspire with Levandowski.

Uber contends that Waymo’s claims are baseless and that none of the files ever made it to the company.

The arrangement to acquire Otto, the startup Levandowski founded, wasn’t exactly a run-of-the-mill deal, however. As Kalanick testified, he had conversations with Levandowski about working for Uber before Levandowski started the company.

“Look, I wanted to hire Anthony and he wanted to start a company, so I tried to come up with a situation where he could feel like he started a company and I could feel like I hired him,” Kalanick said when asked about his early conversations with Levandowski.

Kalanick’s desire to hire Levandowski to help lead Uber’s growing self-driving team, first created in 2015, came out of a frustration with the pace of Uber’s self-driving development. As Recode previously reported, Kalanick had been unsatisfied with Uber’s self-driving progress when it became clear that the company would not be able to meet the initial deadline of launching a self-driving pilot in August 2016.

During his testimony, Kalanick echoed his previously publicized view that driverless cars were essential to Uber’s success, and that the key to scaling a successful fleet of self-driving cars was good lasers.

In a handwritten note submitted as evidence in the case, Kalanick wrote, “laser is the sauce.”

But it’s Kalanick’s emphasis on lasers that Waymo is trying to exploit. Waymo’s attorneys attempted to establish that Levandowski was highly incentivized to achieve very ambitious technical milestones by a certain deadline. The implication, presumably, is that he would stop at nothing to meet those milestones, even appropriating trade secrets.

Those technical milestones were set by Uber as part of the terms of acquiring Levandowski’s startup — an acquisition Uber’s then head of autonomous driving John Bares said he had concerns about, according to Bares’s testimony. Each of those technical laser milestones had monetary incentives tied to them.

For instance, if and when Levandowski’s team outfitted a car with a prototype of a long-range laser-based radar, called lidar, with a visual range of 250 meters, the team would receive 6 percent of the approximately $590 million sale price (roughly $35 million), according to a document produced during Kalanick’s testimony.

However, Kalanick also pointed out that they would be able to get that same monetary incentive if the overall mission of the team was successful.

Kalanick’s testimony will continue on Wednesday.


Scientists Are Rethinking the Very Nature of Space and Time

The Nature of Space and Time

A pair of researchers have uncovered a potential bridge between general relativity and quantum mechanics — the two preeminent physics theories — and it could force physicists to rethink the very nature of space and time.

Albert Einstein’s theory of general relativity describes gravity as a geometric property of space and time. The more massive an object, the greater its distortion of spacetime, and that distortion is felt as gravity.
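For reference, this matter-geometry relationship is what Einstein’s field equations encode, with spacetime curvature on the left-hand side and the energy and momentum of matter on the right:

$$G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}$$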

In the 1970s, physicists Stephen Hawking and Jacob Bekenstein noted a link between the surface area of black holes and their microscopic quantum structure, which determines their entropy. This marked the first realization that a connection existed between Einstein’s theory of general relativity and quantum mechanics.
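That link is captured by the Bekenstein–Hawking formula, in which a black hole’s entropy $S$ is proportional to the area $A$ of its event horizon rather than its volume, tying a thermodynamic (and ultimately quantum) quantity to a purely geometric one:

$$S_{\mathrm{BH}} = \frac{k_{B}\, c^{3}\, A}{4\, G\, \hbar}$$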

Less than three decades later, theoretical physicist Juan Maldacena observed another link between gravity and the quantum world. That connection led to a model proposing that spacetime can be created or destroyed by changing the amount of entanglement between different surface regions of an object.

In other words, this implies that spacetime itself, at least as it is defined in models, is a product of the entanglement between objects.

To further explore this line of thinking, ChunJun Cao and Sean Carroll of the California Institute of Technology (Caltech) set out to see whether they could derive the dynamical properties of gravity (as familiar from general relativity) using a framework in which spacetime arises out of quantum entanglement. Their research was recently posted to the preprint server arXiv.

Using an abstract mathematical concept called Hilbert space, Cao and Carroll were able to find similarities between the equations that govern quantum entanglement and Einstein’s equations of general relativity. This supports the idea that spacetime and gravity do emerge from entanglement.
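For readers who want the standard quantity behind phrases like “amount of entanglement”: for a region $A$ with reduced density matrix $\rho_A$, it is the von Neumann entropy

$$S(A) = -\operatorname{Tr}\left(\rho_A \ln \rho_A\right),$$

and it is changes in this entropy between regions of the Hilbert space that derivations of this kind relate to geometry.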

Carroll told Futurism the next step in the research is to determine the accuracy of the assumptions they made for this study.

“One of the most obvious ones is to check whether the symmetries of relativity are recovered in this framework, in particular, the idea that the laws of physics don’t depend on how fast you are moving through space,” he said.

A Theory of Everything

Today, almost everything we know about the physical aspects of our universe can be explained by either general relativity or quantum mechanics. The former does a great job of explaining activity on very large scales, such as planets or galaxies, while the latter helps us understand the very small, such as atoms and sub-atomic particles.

However, the two theories are seemingly not compatible with one another. This has led physicists in pursuit of the elusive “theory of everything” — a single framework that would explain it all, including the nature of space and time.

Because gravity and spacetime are an important part of “everything,” Carroll said he believes the research he and Cao performed could advance the pursuit of a theory that reconciles general relativity and quantum mechanics. Still, he noted that the duo’s paper is speculative and limited in scope.

“Our research doesn’t say much, as yet, about the other forces of nature, so we’re still quite far from fitting ‘everything’ together,” he told Futurism.

Still, if we could find such a theory, it could help us answer some of the biggest questions facing scientists today. We may be able to finally understand the true nature of dark matter, dark energy, black holes, and other mysterious cosmic objects.


Already, researchers are tapping into the ability of the quantum world to radically improve our computing systems, and a theory of everything could potentially speed up the process by revealing new insights into the still largely confusing realm.

While theoretical physicists’ progress in pursuit of a theory of everything has been “spotty,” according to Carroll, each new bit of research — speculative or not — leads us one step closer to uncovering it and ushering in a whole new era in humanity’s understanding of the universe.



A New Experiment May Finally Reveal the True Nature of Gravity

Unifying Theories

Quantum physicists have toiled for decades, to no avail, in their attempts to subsume gravity (the first cosmological force ever discovered) into quantum physics. A group of researchers recently proposed a new experiment that could redefine the nature of gravity and transform our understanding of the fundamental forces of the universe.

Quantum mechanics is a theory concerned with all fundamental particles and the forces they undergo, with the exception of gravitational attraction. To understand what happens inside a black hole, or what took place during the Big Bang, the theories of gravity and quantum mechanics must be combined. The problem is, whenever this union is attempted, both theories fall apart.

If gravity turns out to be a quantum mechanical force in hiding, then the entanglement we observe in photons could also apply to falling masses. In other words, if two objects (identical in every way besides horizontal position) were in free fall, then measuring the properties of one object could instantaneously affect the other. Sougato Bose of University College London and colleagues have outlined an experiment to test this very hypothesis.

Forking Physics

Consider a neutrally charged particle weighing 10⁻¹⁴ kilograms. Inside this particle is a material with spin, which can point up or down. If you drop the particle through a continuously changing magnetic field, its path should be altered according to its internal spin, almost as if the particle hits a cosmic fork in the magnetic road: left for up spin, right for down.

As the particle falls, it’s in a superposition, or simultaneous occupation, of both paths (think of Schrödinger’s cat, a thought experiment on superposition). Technically, from release, the particle takes both paths at once. Now drop two such particles side by side: combining all the possible paths of the pair includes branches in which their trajectories pass closest to one another, and it is there that gravity acts most strongly between them.

The separation between the two particles’ trajectories should be no less than about 200 micrometers, so that gravity dominates over other interactions. Once the masses are returned to their original states, a measurement should show whether their spin components have become entangled. The test is thus designed to rule out other, interfering forces, such as the Casimir force or electromagnetic interactions.
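To see roughly where the entanglement would come from (a back-of-the-envelope estimate, not the experiment’s exact analysis): each pair of branches in which the two masses $m$ sit a distance $d$ apart for a time $\tau$ accumulates a relative quantum phase from their mutual gravitational potential energy,

$$\phi \sim \frac{G m^{2} \tau}{\hbar\, d}.$$

If gravity is quantum, branches with different separations pick up different phases, and it is this phase difference that entangles the two particles’ spins.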

However, Bose said it’s important to note that a lack of observed entanglement does not prove that all gravity is classical — unless the experiment can definitively show that no other interactions with the environment, like collisions with stray photons or molecules, have interfered.

Quantum Cosmos?

Antoine Tilloy of the Max Planck Institute of Quantum Optics in Germany has voiced his approval, though with the caveat that a positive finding will falsify only some classes of theories of classical gravity. “That said, the class is sufficiently large that I think the result would still be amazing,” he told New Scientist.

In the same article, Maaneli Derakhshani of Utrecht University in the Netherlands argued that a bona fide, verified null finding would prove gravity does not have quantum roots. “This would then raise tough but interesting questions about how and when exactly gravity ‘turns on’ in the quantum-classical transition for ordinary matter,” Derakhshani said. “A null result would be the most surprising and interesting outcome.”

As we acquire additional technological means and scientific evidence, we move closer than ever to answering these age-old questions once and for all, making it a truly exciting time for those who pine for a unified theory of physics.



Scientists Made a Two-Dimensional Material That’s Never Been Seen in Nature

Unlike Anything in Nature

A team of researchers from RMIT University in Melbourne, Australia has reportedly made a “once-in-a-decade discovery” that will radically change how we do chemistry. The discovery? The creation of two-dimensional materials no thicker than a few atoms — something that’s never been seen before in nature.

The research behind this incredible find was led by Professor Kourosh Kalantar-Zadeh and Dr. Torben Daeneke from RMIT’s School of Engineering, who, alongside their students, worked on the material’s development for over a year.

“When you write with a pencil, the graphite leaves very thin flakes called graphene, that can be easily extracted because they are naturally occurring layered structures,” explains Daeneke. “But what happens if these materials don’t exist naturally? Here we found an extraordinary, yet very simple method to create atomically thin flakes of materials that don’t naturally exist as layered structures.”

Liquid metal droplets. Image Credit: RMIT University

To create the 2D material, the team dissolved metals in liquid metal to form very thin oxide layers capable of being peeled away. Daeneke explains that the process of creating the oxide layer is very simple, like “frothing milk when making a cappuccino.” It doesn’t take much technical expertise, so anyone could, theoretically, do it — that said, it’s unclear whether you actually should.

Improved Electronics

While the new material is expected to be a new tool in chemistry, it also promises to improve our existing electronics. In fact, it’s believed it could enhance data storage capabilities and make electronics faster.

Once the oxide layers are peeled away, they could be used as transistor components in modern electronics, making devices faster while requiring less power; this is possible only because of how thin they are. Oxide layers are also used to make the touch screens we’ve become intimately familiar with, which suggests that companies could use this material to make more responsive touch screens, or that people could even eventually learn to make their own screens.

“We predict that the developed technology applies to approximately one-third of the periodic table,” said Professor Kalantar-Zadeh. “Many of these atomically thin oxides are semiconducting or dielectric materials. Semiconducting and dielectric components are the foundation of today’s electronic and optical devices. Working with atomically thin components is expected to lead to better, more energy efficient electronics. This technological capability has never been accessible before.”

The team didn’t specify exactly how their 2D material might impact data storage, but we can speculate that it could make transferring data faster than ever before. Sony’s latest series of SD cards is touted as “the world’s fastest,” with impressive speeds of 300 MB/s. What if similar capabilities could be brought to commercial hard drives or used to enhance cloud storage?

Of course, if the RMIT University team’s research is anything like the low-power semiconductors Stanford engineers were developing in August, or the one-dimensional material proposed last year by University of Texas scientists, it will be some time before we see the 2D material incorporated into everyday hardware. Perhaps even longer than expected, as the team hasn’t said when the technology might be ready for use. Once-in-a-decade discoveries require extensive work and additional confirmation before they can change the world.



A New Quantum Atomic Clock May Finally Reveal the Nature of Dark Matter

Tick Tock

Scientists at the University of Colorado Boulder’s JILA (formerly the Joint Institute for Laboratory Astrophysics) have developed an incredibly precise quantum atomic clock based on a new three-dimensional design. The project has set a new record for quality factor, a metric used to gauge the precision of measurements.

The clock packs atoms of strontium into a cube, achieving 1,000 times the density of prior one-dimensional clocks. The design marks the first time that scientists have been able to successfully utilize a so-called “quantum gas” for this purpose.

Previously, each atom in an atomic clock was treated as a separate particle, so interactions between atoms could cause inaccuracies in the measurements. The “quantum many-body system” used in this project instead organizes the atoms in a pattern that forces them to avoid one another, no matter how many are introduced to the apparatus. A state of matter known as a degenerate Fermi gas — a gas composed of fermions — allows all of the atoms to be quantized.

“The most important potential of the 3-D quantum gas clock is the ability to scale up the atom numbers, which will lead to a huge gain in stability,” said physicist Jun Ye, of the National Institute of Standards and Technology (NIST), who worked on the project. “We are entering a really exciting time when we can quantum engineer a state of matter for a particular measurement purpose.”

During laboratory tests, the clock recorded errors amounting to just 3.5 parts in 10 quintillion (a fractional uncertainty of about 3.5 × 10⁻¹⁹), making it the first atomic clock to achieve such accuracy.

Watch the Clock

“This new strontium clock using a quantum gas is an early and astounding success in the practical application of the ‘new quantum revolution,’ sometimes called ‘quantum 2.0’,” said Thomas O’Brian, chief of NIST’s quantum physics division and Ye’s supervisor. “This approach holds enormous promise for NIST and JILA to harness quantum correlations for a broad range of measurements and new technologies, far beyond timing.”


Atomic clocks have clear-cut applications for tasks like time-keeping and navigation. However, the same technology can be used in various different strands of research — like the ongoing effort to better understand dark matter.

It’s been suggested that monitoring minor inconsistencies in the ticking of an atomic clock might offer insight into the presence of pockets of dark matter. Previous research has shown that a network of atomic clocks, or even a single highly sensitive system, might register a change in the frequency of vibrating atoms or laser light in the clock if it passed through a dark matter field. Given that this project offers much greater stability than its predecessors, it could contribute to new breakthroughs in solving this persistent cosmic mystery.
