To Understand The Atmospheres Of Distant Exoplanets, Look To Your Car Engine

The next time you’re stuck in a mundane traffic jam, find some excitement in your car engine’s secret identity: it’s actually not so different from the exotic exoplanets in our universe.

Seriously. Stay with me here.

French astronomers discovered that computer models used to simulate how car engines emit pollutants could also be used to model hot exoplanet atmospheres.

An artist’s impression of a “hot Jupiter.” Exoplanets like these can, surprisingly, be modeled with tools built to simulate car engines. Image Credit: NASA, ESA & G. Bacon

The planets in question are scorching goliaths. They’re the size of Neptune or Jupiter, but orbit 50 times closer to their star than Earth does the Sun. This gives them hydrogen-rich gaseous atmospheres of 1,000 to 3,000 degrees Celsius (1,832 to 5,432 degrees Fahrenheit), which whip around at speeds of 10,000 kilometers (over 6,000 miles) per hour.

Under such intense (to say the least) conditions, scientists historically had trouble modeling what chemicals might be found in these atmospheres. As the hellishly hot, insanely fast gases swirl, they interact in unusual ways, creating chemicals that don’t fit the typical models astrophysicists use to simulate planets.

Almost shockingly, these extreme temperature and pressure conditions are not so different from those found in car engines. Car engine pollution models can examine temperatures over 2,000 degrees Celsius, along with a wide range of pressures. This makes them flexible enough to study warm exoplanets, too. 
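To make that overlap concrete, here is a minimal, illustrative sketch (not code from the actual study) of the modified Arrhenius law that combustion kinetics networks are built on. The reaction parameters below are invented for illustration; the point is that the same rate expression applies whether the temperature describes a cylinder during ignition or a hot Jupiter’s dayside:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def rate_constant(A, n, Ea, T):
    """Modified Arrhenius law used in combustion kinetics: k = A * T^n * exp(-Ea / (R*T))."""
    return A * T**n * math.exp(-Ea / (R * T))

# Illustrative (made-up) parameters for a generic hydrogen-abstraction reaction.
A, n, Ea = 1.0e8, 1.5, 40_000.0  # Ea in J/mol

for label, T in [("room temperature", 298.0),
                 ("hot-Jupiter dayside", 1_800.0),
                 ("engine combustion", 2_400.0)]:
    print(f"{label:20s} T = {T:6.0f} K   k = {rate_constant(A, n, Ea, T):.3e}")
```

Because the rate constant grows steeply with temperature, a chemical network validated only at room temperature says little about a 2,000-degree atmosphere; one validated against engine combustion covers the right regime.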

Since 2012, the French research team has used these models to create simulations of the atmospheres of hot Jupiters and warm Neptunes, which were then made available to the astrophysics community in an open-access database.

The next step for this research will be to incorporate data from research at particle accelerators, which can provide information on how molecules absorb ultraviolet light at the extreme temperatures of exoplanets — data that was previously only available at room temperature.

“Other fields of research have an important role to play in the characterization of the fantastic diversity of worlds in the Universe, and in our understanding of their physical and chemical nature,” explained Olivia Venot, one of the lead authors and a researcher at the Laboratoire Interuniversitaire des Systèmes Atmosphériques (Interuniversity Laboratory of Atmospheric Systems), in a press release.

These models could help scientists figure out how these faraway exoplanets work without ever being able to reach them. After all, our car engines can’t transport us to distant worlds. But they can get us a little closer to understanding them.

Apple doesn’t understand what makes Chromebooks great

Apple announced a new iPad earlier this week at an education-focused event. Though it’s rare for the company to directly acknowledge its competition during presentations, it made no secret that the new iPad was aimed at one rival in particular: Chromebooks. The ePad, as some call it, came in at the same $329 price tag as the previous model ($299 for schools), but includes a faster processor and support for the Apple Pencil. At the same time, Apple announced a suite of new software tools to make the iPad more useful for academia. Apple arguably has more clout than any other…

Amazon dreams up a drone that will understand your hand signals

Amazon was just issued a patent for a UAV that can interpret gesture and vocal commands, a device that could in theory be used to deliver packages. First spotted by GeekWire, the patent describes a drone-like device outfitted with various sensors, ca…

Neil deGrasse Tyson: We Don’t Understand the Most Fundamental Aspects of Our Universe

We have nanobots that swim inside our bodies and monitor our vital organs. We have autonomous robots that work alongside human doctors to perform complex surgeries. There are rovers driving across the surface of Mars and, as you read this, three humans are orbiting high above you, living in the cold vacuum of space.

In many ways, it seems like we’re living in the future. But if you ask Neil deGrasse Tyson, it seems like we’re little more than infants trying to clutch sunbeams in our fists.

At the 2018 World Government Summit in Dubai, Tyson gave a presentation to an enraptured audience. The topic? How humans will — most definitely not — colonize Mars (Tyson, if you aren’t aware, is an eternal skeptic). It seems fitting then that, following his rather depressing speech, he took the time to discuss how humans are, in many ways, entirely ignorant.

Here are three things that, according to Tyson, show just how far we have to go:

Dark Matter

A portion of our universe is missing. A rather significant portion, in fact. Scientists estimate that less than 5% of our universe is made up of ordinary matter (protons, neutrons, electrons, and all the things that make our bodies, our planet, and everything we’ve ever seen or touched). The rest of the matter in our universe? Well, we have no idea what it is.

“Dark matter is the longest standing unsolved problem in modern astrophysics,” Tyson said. He continued with a slightly exasperated sigh, “It has been with us for eighty years, and it’s high time we had a solution.” Yet, we aren’t exactly close.

The problem stems from the fact that dark matter doesn’t interact with electromagnetic radiation (aka light). We can only observe it through its gravitational influence — say, a galaxy spinning faster than its visible matter alone should allow. However, a number of ongoing experiments seek to detect dark matter directly, such as those at SNOLAB and the ADMX experiment, so answers may be on the horizon.
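The “spinning faster than it should” argument can be sketched in a few lines. If a galaxy’s visible mass were all there is, orbital speeds should fall off with radius as v = sqrt(G*M/r); observed rotation curves instead stay roughly flat, and the gap is attributed to dark matter. The mass and speed below are rough, illustrative values for a Milky-Way-like spiral, not measurements:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_VISIBLE = 2e41   # rough visible mass of a Milky-Way-like galaxy, kg (illustrative)
KPC = 3.086e19     # one kiloparsec in meters

def keplerian_speed(r):
    """Orbital speed if only the visible mass pulls inward: v = sqrt(G*M/r)."""
    return math.sqrt(G * M_VISIBLE / r)

V_OBSERVED = 220e3  # typical flat rotation speed of a spiral galaxy, m/s

for r_kpc in (10, 20, 40):
    v = keplerian_speed(r_kpc * KPC)
    print(f"r = {r_kpc:2d} kpc: predicted {v/1e3:4.0f} km/s, observed ~{V_OBSERVED/1e3:.0f} km/s")
```

The prediction drops from roughly 200 km/s at 10 kpc to about 100 km/s at 40 kpc, while real galaxies keep orbiting at around 220 km/s; something unseen must be supplying the extra gravity.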

Dark Energy

Dark energy is, perhaps, one of the most interesting scientific discoveries ever made. This is because it may hold the keys to the ultimate fate of our universe. Tyson explains it as “a pressure in the vacuum of space forcing the acceleration of the [expansion of] the universe.” Does that sound confusing? That’s probably because it is.

If you weren’t aware, space itself is expanding: the space between distant galaxies is stretching everywhere at once. Gravitationally bound systems, like our solar system, hold themselves together, which is why we notice nothing locally. But on a cosmic scale, the expansion’s impact is profound.
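For a sense of scale, Hubble’s law says recession speed grows linearly with distance, v = H0 * d, with H0 roughly 70 km/s per megaparsec. Naively applying it across one astronomical unit (which, strictly, doesn’t expand, since the solar system is bound) versus across 100 megaparsecs shows why the effect is invisible locally and enormous cosmically:

```python
H0 = 70.0              # Hubble constant, km/s per megaparsec (approximate)
KM_PER_MPC = 3.086e19  # kilometers in one megaparsec

def recession_speed(distance_mpc):
    """Hubble's law: v = H0 * d, in km/s."""
    return H0 * distance_mpc

AU_KM = 1.496e8  # one astronomical unit in kilometers
print(f"Naively across 1 AU:  {recession_speed(AU_KM / KM_PER_MPC):.1e} km/s")
print(f"Galaxy at 100 Mpc:    {recession_speed(100):.0f} km/s")
```

That works out to roughly a third of a micrometer per second, about ten meters per year, across the whole Earth-Sun distance, versus 7,000 km/s for the distant galaxy.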

Because space is so vast, billions of light-years of it are expanding, causing distant galaxies to fly away from us at unimaginable speeds. If this flight continues, the cosmos will eventually be nothing more than a cold, unendingly dark void. If the expansion someday reverses, the universe will instead collapse in on itself in a Big Crunch. Unfortunately, we have absolutely no idea which will happen, because we have no clue what dark energy is.

Abiogenesis

We know a lot about how life evolved on Earth. The earliest forms of life emerged about 3.5 billion years ago. These single-celled creatures dominated our planet for almost three billion years. A little over 600 million years ago, the first multicellular organisms took up residence. The Cambrian explosion followed soon after and *boom*, the fossil record blossomed. Roughly 500 million years ago, plants started taking to land. Animals soon followed, and here we are today.

However, Tyson is quick to point out that we don’t understand the most vital component of evolution: the beginning. “We still don’t know how we go from organic molecules to self-replicating life,” Tyson said, noting how unfortunate this is because “that is basically the origin of life as we know it.” The process is called abiogenesis: in plain terms, how life arises from nonliving matter. Although we have a number of hypotheses about how it happened, we have neither a comprehensive understanding nor decisive evidence favoring any one of them.

There we have it. The biggest mysteries of the cosmos just happen to be some of the most important and fundamental. So, when will we finally figure out these scientific conundrums and move out of our infancy? Tyson refuses to make a prediction.

If there’s one thing he knows, it’s how very little humans actually know: “I’m not very good at predicting the future, and I’ve looked at other people’s predictions and seen how bad those are even among those that say ‘I am good.’ So I can tell you what I want to happen, but that’s different than what I think will happen.”

Sony made a projector helmet to help you understand mosquitoes

Every year at SXSW in Austin, TX, Sony shows off a bunch of experimental projects. This year, the company introduced its Superception Head Light system, which is basically a helmet with a Sony MP-CL1 projector sticking out the front and Sony MDR-XB950 headphones attached on the side. Someone at Sony strapped a bunch of devices together and attached an HTC Vive tracker to the back so that wearers can move around the room while the projection follows their movements.

Sony says the device is supposed to demonstrate how technology can affect human perception through our various senses. This demo attempts to teach wearers how animals use their senses to get around the world — like how mosquitoes use smell to find blood. The…

Actions on Google Assistant now understand 7 new languages

Google is also at MWC 2018 to unveil new technology. The company just announced major improvements to Actions, the framework behind third-party Google Assistant capabilities like booking tickets and adding reminders. Starting this week, Actions can be built in seven new languages, bringing the total to 16: Hindi, Thai, Indonesian, Danish, Norwegian, Swedish, and Dutch join English, French, German, Japanese, Korean, Spanish, Brazilian Portuguese, Italian, and Russian. The 16 supported languages can be used for developing localized apps for the regions where those languages are spoken. From…

NASA May Stop Trying to Understand Dark Energy

First To Fall

In recent weeks we’ve heard how the Trump administration’s proposed NASA budget might affect the future of the agency’s projects. The International Space Station could be eyeing its last seven years in service; its funding will likely not be extended beyond the mid-2020s. Now, another NASA initiative is on the chopping block.

The new budget, if passed, will defund the Wide-Field Infrared Survey Telescope (WFIRST), a project ranked as a top priority by a blue-ribbon panel of the National Academy of Sciences in 2010. The telescope was set to launch in the next decade, helping astronomers in their quest to explore an expanding universe and unravel the mysteries behind dark energy.

NASA’s acting administrator, Robert M. Lightfoot Jr., described the cut as “one hard decision,” according to a report from the New York Times. He stressed the need to reallocate the telescope’s funds — the project is set to cost more than $3 billion in total — into other areas of research.

Astronomers have harshly criticized plans to nix WFIRST’s funding. A statement from the American Astronomical Society suggested that NASA’s budget reductions could “cripple U.S. astronomy.”

“A handful of people within the bureaucracy,” David Spergel, former chairman of the academy’s Space Studies Board, told the New York Times, “have overturned decades of community-driven processes and tried to set the direction for space astronomy.”

Learning more about dark energy — a cosmological force that makes up 68 percent of the universe — could have a profound impact on our knowledge of how and why our universe is expanding. Scientists want to delve deeper into its intricacies, but need better tools to do so. That’s where WFIRST comes in.

WFIRST’s original mission timeline was pushed back because of delays to the James Webb Space Telescope, which went well over budget. When it became clear that WFIRST wasn’t going to launch on schedule, NASA purchased a share of Euclid, a European Space Agency mission to explore dark energy. But Euclid isn’t expected to be as comprehensive as WFIRST, and NASA will have to rely on an outside agency for dark energy data. Without a wholly NASA-based mission, U.S. dark energy research will suffer.

It’s no secret that the Trump administration wants NASA to focus on sending astronauts to the Moon — but now it seems clear that prioritizing that kind of attention-grabbing program might come at the cost of other, perhaps more important, research.

Of course, there’s no guarantee that Congress will approve this budget in its current form. Despite the administration’s sweeping cuts and proposed reallocations, the clock has yet to strike midnight on the WFIRST mission.

The Emerging Marketing Intent Graph: How to Understand and Manage Consumer Preferences

The following is a guest contributed post by Sameer Patel, CEO of Kahuna.

The battleground for understanding and managing consumer preferences is heating up. The recent acquisition of consumer identity management provider Gigya by SAP for $350 million is an encouraging sign: proof that consumer marketers are increasingly interested in deeply understanding how the consumer wants to traverse the purchase journey.

The acquisition got me thinking about how the adoption of very useful but very fractured consumer engagement touchpoints over the past decade has left us with a woefully chaotic view of consumer intent. Here’s what the marketer is experiencing:
  1. Weak signals from spray-and-pray email, largely limited to opens and click-throughs.
  2. Shaky purchase intent gleaned from fly-by-night gestures such as “likes” inside the walled gardens of social networks.
  3. Rich, near real-time but siloed digital breadcrumbs from mobile experiences that are disconnected from a batch & blast-style email marketing technology stack.
  4. Even richer yet even more disconnected insights captured via browse and cart abandonment, resulting in a staggering drop-off of seven out of 10 transactions, because your e-commerce system can’t effectively take the baton from your email or mobile marketing system to maintain a consistent thread of the customer’s journey.
  5. A lack of preparation for the coming tsunami of emerging engagement channels in the next 12 months: beacons, chatbots, AR/VR, voice, and ironically, what Forrester analyst and retail expert Brendan Witcher describes as an uninspiring effort by brick & mortar to bring its sexy back.
The lure of being able to capture the sheer scale of customer expression from this wide array of touchpoints no doubt gives digital commerce providers some serious X factor. But the single most glaring inefficiency in all of this is that marketers have lost any single identity layer that exposes true intent.

Consumers Are Complicated

The reality is this: Some of us research at certain times and buy at other times. Some find products and research products on different devices or physical locations. Others want to be marketed to based on what similar buyers have browsed or bought. And for yet another group, even the incessant bombardment of untargeted emails won’t be able to trigger a purchase; instead, a simple external event such as good weather or a baseball game this weekend might be enough motivation. You get my drift—consumers are a complicated bunch. We’re all different.

Introducing the Intent Graph

True identity comes from building a dynamic Intent Graph by gathering gestures across all touchpoints in near real time and orchestrating a meaningful 1:1 personalized experience for every individual consumer.

If there is a single common thread I’ve seen across modern marketers who are giving established brands a run for their money, it is the ruthless prioritization of the marketing stack around the Intent Graph. Everything else—segmentation, campaign design and execution, and message delivery—must work to maximize the value of the Intent Graph.

The good news is that this doesn’t need to be as daunting as it sounds, and it doesn’t require the complete replacement of your existing email, web, or mobile marketing automation technology. Leading digital brands are simply injecting an intelligent layer that can make sense of consumer gestures and make machine-driven orchestration decisions about how to engage consumers, in real time.
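As a concrete (and entirely hypothetical) sketch of that intelligent layer, the toy code below collapses gestures from separate channels under one consumer identity and keeps a running intent score per product. The gesture weights and names are invented for illustration and imply no particular vendor’s API:

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical weights: how strongly each gesture type signals purchase intent.
GESTURE_WEIGHTS = {"email_open": 0.1, "like": 0.2, "browse": 0.4,
                   "cart_add": 0.8, "cart_abandon": 0.6}

@dataclass
class Gesture:
    consumer_id: str   # unified identity across email, mobile, web, in-store
    channel: str       # e.g. "email", "mobile_app", "web"
    kind: str          # one of GESTURE_WEIGHTS
    product: str

class IntentGraph:
    """Toy single-identity layer: merges gestures from every touchpoint."""
    def __init__(self):
        self.scores = defaultdict(lambda: defaultdict(float))  # consumer -> product -> score

    def ingest(self, g: Gesture):
        self.scores[g.consumer_id][g.product] += GESTURE_WEIGHTS.get(g.kind, 0.0)

    def top_intent(self, consumer_id: str):
        products = self.scores.get(consumer_id, {})
        return max(products, key=products.get) if products else None

graph = IntentGraph()
graph.ingest(Gesture("c42", "email", "email_open", "sneakers"))
graph.ingest(Gesture("c42", "mobile_app", "browse", "sneakers"))
graph.ingest(Gesture("c42", "web", "cart_abandon", "sneakers"))
print(graph.top_intent("c42"))  # -> "sneakers"
```

The point of the design is the single keyed identity: an email open, a mobile browse, and a web cart abandonment all land on the same consumer record instead of living in three silos.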

A Dynamic Future Awaits Us

And by the way, this is just the beginning. The Intent Graph will be extremely dynamic. Emerging touch points such as beacons will express intent with situational awareness. The growing popularity of voice assistants such as Amazon Echo or Google Home will enable consumers to express intent with an added emphasis on tone and emotion. And if that’s not enough, AR/VR wipes out these seemingly physical and digital constraints and lets the consumer express tone, emotion, and situational awareness without ever leaving her living room. The Intent Graph keeps building from her couch.

CMOs Must Focus on Consumer Intent, Not on the Process

CMOs at some of the most iconic brands of the past lost the plot when they focused on the process and not on the consumer’s intent. This scathing characterization by Mark Bonchek and Gene Cornfield in Harvard Business Review about the reasons behind the removal of Coca-Cola’s CMO sounds like a career obituary that may well become the rule and not the exception:

“Coca-Cola—widely regarded as one of the top marketers in the world—recently eliminated the role of CMO and replaced it with a Chief Growth Officer. The previous CMO was known for his focus on campaigns and was thanked for ‘improving the productivity of marketing’ and leading a ‘resurgence in the quality of advertising.’ In contrast, the CEO explained the leadership changes as necessary to ‘respond to the fast-changing needs’ of customers, employees and partners and to ‘transform our business for the future.’”

Ouch. But it unapologetically reinforces my point.

Final Thoughts

Channels will come and go. Campaigns and delivery will always matter. But a 1:1 personalized experience based on an ever-fluid Intent Graph for every individual consumer will be the digital commerce battleground for e-commerce brands and marketplaces in 2018.

ABOUT THE AUTHOR

Sameer Patel is the CEO of Kahuna, a leading AI-powered consumer marketing software provider serving iconic digital commerce brands and online marketplaces. Sameer is @sameerpatel on Twitter.

Empathic AI: The Next Generation of Vehicles Will Understand Your Emotions

Transportation will never be the same once our machines know how we feel.

We are all entering a wholesale, global disruption of the way people move from one place to another. More than any other change in this sector, the one that is likely to have the most significant impact on human society is the rise of autonomous (i.e., self-driving) vehicles.

The transportation industry, historically dominated by a handful of large vehicle-manufacturing brands, is evolving into an ecosystem of ‘mobility services,’ underpinned by artificial intelligence (AI). A major step in the development of AI is to give it ‘empathy,’ allowing our physiological and emotional states to be observed and understood. This will mostly be achieved through wearable or remote sensors, the same way fitness bands already allow our physical state to be monitored.

By feeding this sensor data into AI systems, we can train them to know how we feel and how to respond appropriately. This kind of empathy can also be enhanced by giving AI its own artificial emotions, imbuing it with simulations of feelings.

Empathic technology will have no small effect on the mobility sector. How might an empathic vehicle look?

Safe Travels

There is already a growing body of research from top-tier auto companies into what kinds of empathic interactions will protect drivers, passengers, and everyone around them from harm. To investigate this, biometric sensors, cameras, and microphones are being used to detect (see the sketch after this list):

  • Fatigue & drowsiness: e.g., monitoring head or eye movements, posture or heart/breathing rate.
  • Distraction: e.g., gaze detection to ensure the driver is watching the road.
  • Intoxication: e.g., using infrared optics or analyzing voice or breath.
  • Medical incidents: e.g., detecting a potential cardiac event from a wearable heart-rate sensor.
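As a minimal illustration of the fatigue case, the sketch below combines two of those signals, eyelid closure (PERCLOS, a standard drowsiness metric) and heart rate, into a single alert. The thresholds are invented for illustration and come from no production system:

```python
def perclos(eye_closed_flags):
    """Fraction of recent video frames in which the eyes were closed (PERCLOS)."""
    return sum(eye_closed_flags) / len(eye_closed_flags)

def fatigue_alert(eye_closed_flags, heart_rate_bpm,
                  perclos_limit=0.15, resting_hr=55):
    """Alert only when both signals agree, to cut down on false positives."""
    drowsy_eyes = perclos(eye_closed_flags) > perclos_limit
    low_heart_rate = heart_rate_bpm < resting_hr  # heart rate often dips with drowsiness
    return drowsy_eyes and low_heart_rate

# One minute of 1 Hz camera samples: eyes closed in 12 of 60 frames, heart rate 52 bpm.
frames = [1] * 12 + [0] * 48
print(fatigue_alert(frames, 52))  # -> True: both signals point to fatigue
```

Requiring agreement between independent sensors is the key design choice: a glance downward or a calm moment alone shouldn’t trigger an intervention.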

A Comfortable Journey

After ensuring the safety of the humans in the system, empathic tech can be employed to optimize the ride experience. There is a universe of auto suppliers you’ve probably never heard of that build the components and systems that end up in well-known vehicle brands. They are leading the way to a more empathic ride, with innovations such as:

  • Environmental controls: e.g., lighting, heating, AC, sound and olfactory output, customized to suit your current mood.
  • Physical controls: seat position, engine configuration, etc.
  • Humanizing AI feedback: virtual assistants like Alexa and Siri, already invading our homes and phones, are also reaching into our vehicles. With empathic AI, we can tailor their feedback to suit our preferred style of interaction.

An Entertaining Ride

Now that our customer is safe and comfortable, they can benefit from AI that knows how to push the right emotional buttons at the right moment. This is particularly likely to apply to onboard music and infotainment systems. Other subtle ways a vehicle could be designed to optimize the thrill of the ride include offering to increase the engine’s power output when the driver is feeling confident and happy.

The New Norms of Autonomous Society

An autonomous vehicle doesn’t exist in a bubble. Much of its intelligence is based on sensing its environment and making rapid judgments about how to act. Each vehicle will also be integrated with a global network of systems, able to share information ranging from weather forecasts to road obstructions. By connecting each vehicle to its neighbors and the wider world, we will see the emergence of a new type of ‘social’ structure with its own norms of behavior.

This AI-driven ‘society’ will involve interactions not just between the vehicles and their drivers or passengers, but also with onboard devices, nearby pedestrians, other vehicles, and their occupants, as well as surrounding infrastructure. The etiquette and rules of what the market calls ‘vehicle-to-everything’ (V2X) communications will establish themselves as we gradually let go of the wheel and hand our mobility needs over to ‘the machines.’

This mobility ecosystem is also likely to share data and processes with the rest of the AI in our lives, such as in our smartphones and home-automation systems. If coordinated correctly, this unified data architecture would allow empathic vehicles to know us much better, behaving ever more like a trusted friend.

This is not just a technological problem; it’s a monumental user-experience challenge too. Gradually increasing the empathic capability of the system will support the evolution of the transport experience towards one that is not only safe and comfortable but also delightful.

The future of mobility is emotional.

Editor Note & Disclaimer: The author is a member of the Sensum team, which is an alumnus of our ReadWrite Labs accelerator program. 

Burger King thinks a Whopper can help you understand net neutrality — it really can’t


Net neutrality is the current cause du jour, and rightfully so. If undermined or eliminated, it could have lasting consequences for tech innovation in America. The thing is, it’s a really complicated topic. If you’re not particularly technologically clued up, you might struggle to grasp it. Burger King is an unlikely champion for net neutrality, and has created a video that “explains” it through an even unlikelier device: the Whopper. If you’ve ever wondered what it looks like when a fast food company wades ineptly into a topic it scarcely understands, you’ll probably want to watch this: The premise is…
