Infinix Zero 5 Battery Life Test – #OneChargeRating


Infinix recently launched the Infinix Zero 5 smartphone. We have already brought you the review of the phone; here are its battery life test results. It packs a 4350mAh built-in battery with support for 18W xCharge fast charging, has a 5.98-inch 1080p display, is powered by an octa-core MediaTek Helio P25 16nm SoC with 6GB of RAM, and runs Android 7.0 (Nougat) with XOS 3.0. Check out the test results below.

Talk Time: It lasted 33 hours and 34 minutes in our talk time test.
3G Browsing: It lasted 10 hours and 7 minutes in our 3G browsing test, topping the charts.
WiFi Browsing: It lasted 12 hours and 48 minutes in our WiFi browsing test.
Video Playback: It lasted 14 hours and 15 minutes in our video playback test.
Charging Time: It took 2 hours and 22 minutes to charge from 0 to 100% with its fast charging, and 46 minutes from 0 to 50%.
Standby Time: It lasted 57 days in our standby test.

It achieved a One Charge Rating of 17 hours and 49 minutes, which is good for a phone with a 4350mAh battery. Check out our battery life test procedure to learn more about our tests in detail.
Fone Arena

Vivo Apex hands-on: pop-up selfie cam, in-screen fingerprint scanner, and practically zero bezel


In-display fingerprint scanners on phones had been prophesied for some time, but few expected that the relatively unknown Chinese manufacturer Vivo would be the first to bring a consumer product to market. The X20 Plus UD went on sale last month, exclusively in China, for around $565.

By all accounts, the fingerprint reader works pretty well, although the phone was mostly unremarkable otherwise. It shipped with dual rear cameras, a 1080p 6.43″ OLED panel, and a Snapdragon 660.

Read More

Vivo Apex hands-on: pop-up selfie cam, in-screen fingerprint scanner, and practically zero bezel was written by the awesome team at Android Police.

Android Police – Android news, reviews, apps, games, phones, tablets

Brilliant Ideas That Buy Us Time Before Cape Town Reaches Day Zero

In just over 100 days, researchers expect Cape Town to run out of water. The South African megacity has traditionally enjoyed abundant rains during winter and a warm, pleasant climate during summer, but after three years of drought, experts now expect the city’s water system to collapse on June 4, 2018.

Currently, Cape Town’s citizens have access to 13.2 gallons of water per day, about a quarter of what the average American uses daily and equivalent to little more than a six-minute shower. Experts predict water levels in the six dams that feed the city will fall below 13.5 percent of their capacity on June 4, effectively leaving the city dry. On that “Day Zero,” nearly four million residents could see their water rationed to 6.6 gallons per day per person.

As time runs out for Cape Town, the brightest minds in Africa and beyond are scrambling for last-ditch solutions to stave off the crisis. Here are some of the most ambitious ideas that may buy time while the city sets up desalination plants and hopes for rain.

Image Credit: Creative Commons

Proposed solution: Drag an iceberg down to Western Cape

What is it? As wild as it sounds, computer simulations have shown we could transport a mammoth clump of ice thousands of miles while retaining more than half its mass during the journey. A 2009 study by French software firm Dassault Systemes showed that it is possible to tow a seven-million-metric-ton iceberg to the Canary Islands, off the northwest coast of Africa, in under five months, with a loss of only 38 percent of its mass.

With a budget of approximately $10 million, engineers could fit the iceberg with an insulating skirt to reduce melting and then tether it to a boat traveling at a speed of about one knot (1.1 miles per hour).

How will it help? After docking, the water would have to be distributed across the city, which would presumably add to the staggering costs of the operation. However, with Day Zero looming, the investment may be worth it. The Abu Dhabi-based National Adviser Bureau told Gulf News that the average iceberg could provide up to 20 billion gallons of fresh water. The Daily Maverick did the math: If such a wild idea were to come true, it could solve Cape Town’s water crisis for almost half a year.

Image Credit: I-Drop Water

Proposed solution: I-Drop Water

What is it? Launched in 2015, this South African nonprofit distributes water purification systems to shops and grocery stores as a way to reach the most people while keeping water affordable. The filters remove viruses, bacteria, and sediments from the water, and a central station monitors the distribution units through an embedded SIM card, so shopkeepers pay for each gallon of dirty water they purify and sell.

How will it help? In the wake of the Cape Town crisis, Swedish group Bluewater, which also produces water purifiers, teamed up with I-Drop to distribute more systems across southern Africa. The goal is to keep the price of water lower compared to single-use bottles, while also tackling the growing problem of plastic pollution in the continent’s main urban centers. When Day Zero hits Cape Town and people are left with less than seven gallons a day, recycled dirty water could provide a life-saving top up.

Image Credit: Greenchain Engineering

Proposed solution: Greenchain Engineering

What is it? The South African startup targets the entire water supply chain, providing rain harvesting systems as well as improving the management of gray water, which is not drinkable but can be used to wash dishes, run a washing machine, or take a shower. Although a lack of rain is currently Cape Town's main problem, citizens could still harvest what does fall during short showers from their roofs. The system filters and distributes water collected this way.

How will it help? The startup has started a conversation with the Cape Town municipality to roll out their services on a large scale. While equipping every roof with a water harvesting system may not stop Day Zero from arriving, it could help the city manage the limited resources left and potentially prevent future crises once the rain comes back.

Image Credit: Warka Water

Proposed solution: Fog catchers

What is it? They come in various forms, from a simple square sail stretched between two poles to a complex tent-like structure, but their goal is the same: capture every droplet of moisture in the air and turn it into drinkable water.

The intricate fabric of a fog catcher traps condensation, be it from post-rain humidity or morning mist, and channels it into a container. The devices are designed to meet the needs of remote communities that have to rely on erratic rains for their daily water supply, and they’ve proven so popular people now use them everywhere in the world, from South America to Africa.

How will it help? Innovator Grant Vanderwagen is piloting a simple version of fog catchers in Cape Town. Although the idea is still little more than a proof of concept, the entrepreneur told VentureBurn that a single unit could produce up to 10,000 liters (2,200 gallons) of water per month, depending on the weather.

Image Credit: Creative Commons

Proposed solution: #defeatdayzero

What is it? While the H2O (Hack Two Day Zero) hackathon held in Cape Town on February 9 and 10 was not a solution in itself, the array of fresh ideas generated could help the city get through the crisis.

How will it help? The participants had two days to work together and come up with a prototype addressing the short- and long-term implications of severe drought. The winning team created Tiny Loop, a battery-operated shower system that lets people shower longer while using less water. They now have a cash prize to bring their project to life, and it could help citizens maintain proper hygiene during the crisis.

Image Credit: Tiny Eco / @TinyLoopSA

The post Brilliant Ideas That Buy Us Time Before Cape Town Reaches Day Zero appeared first on Futurism.


Xperia XA1 family’s Oreo updates will ditch built-in blue light filter, meaning that zero Sony phones will come with any form of ‘night mode’


These days, many manufacturers include some sort of blue light filter or “night light” in their phones’ ROMs. Google, Samsung, OnePlus, and some other companies are on the list, and Sony was as well with its “Good night actions” function in Xperia Actions for phones in the XA1 family. However, following the impending Oreo updates, no Sony phone will have a built-in blue light filter.

This news comes by way of Sony Xperia’s official Twitter account.

Read More

Xperia XA1 family’s Oreo updates will ditch built-in blue light filter, meaning that zero Sony phones will come with any form of ‘night mode’ was written by the awesome team at Android Police.

Android Police – Android news, reviews, apps, games, phones, tablets

Ford’s vision for driverless police cars offers zero chance to flirt your way out of a ticket

A patent from Ford revealed ideas for autonomous police cars which are capable of finding law-breakers, doling out tickets, and even waiting in hiding spots. The patent, filed in 2016 and spotted by Motor1 last week, details all the ways in which an autonomous police car could help catch law-breaking drivers. The language specifically proposes a future in which autonomous vehicles are more common, and what role police vehicles would play: While autonomous vehicles can and will be programmed to obey traffic laws, a human driver can override that programming to control and operate the vehicle at any time. When…

This story continues at The Next Web
The Next Web

Why ‘all hands’ customer support makes zero sense

In the software world and perhaps beyond, it has become increasingly popular to implement an “all hands” approach to customer support. The thinking goes that if everyone on a team spends some time at the support desk engaging with customers and solving their problems, they will better understand how the customer uses the product and what they’re actually trying to accomplish. Customer empathy will increase, and team members will go about their actual roles in a more customer-centric way. Engineers will engineer with the customer in mind because they’ve interacted with them. Marketers will market to the customer more…

This story continues at The Next Web
The Next Web

Increase Performance in Cross-Browser Testing with Zero Effort – Here’s How


Are you used to getting a certain amount of data from your testing practices? Did you know that today you can extract more data from your existing testing practice with zero additional effort? This all plays into the shift-left movement, which delivers insight earlier and more easily. When thinking about shifting left, you should answer two questions:

1. What new insights can I gain earlier?
2. How easy is it to implement?

Shifting performance activities left is top of mind for many engineering teams. The reason for this trend is that late discovery of extreme application latency typically forces a choice: either the brand compromises on user experience in favor of time to market, or the release is delayed for an extended round of code rework, a very expensive task for developers and one teams are looking to eliminate.

The Challenge

There are various reasons why performance activities are usually done late, or outside the development cycle. Some of these include team structure, an outdated perception of performance testing, or the tools being used. This article describes a web-timing approach, along with the motivation and approaches for shifting performance activities left.

Web Page Timing

These are page-level stats. Web page timers, defined in the W3C Navigation Timing specification, aren't necessarily new; however, they are very helpful in optimizing web content across various pages and browsers. The data is extremely detailed and readily available for analysis, and almost all browsers support the API, so you don't need any special setup to collect and report these metrics.

Grabbing the page timers is fairly easy, simply leverage the following:

Map<String, Object> pageTimers =
        (Map<String, Object>) w.executeScript("return window.performance.timing;");

Here’s an example of the timers resulting from a single page load:




Processing the timers can be done as follows:

long navStart = data.get("navigationStart");
long loadEventEnd = data.get("loadEventEnd");
long connectEnd = data.get("connectEnd");
long requestStart = data.get("requestStart");
long responseStart = data.get("responseStart");
long responseEnd = data.get("responseEnd");
long domLoaded = data.get("domContentLoadedEventStart");

this.duration = loadEventEnd - navStart;
this.networkTime = connectEnd - navStart;
this.httpRequest = responseStart - requestStart;
this.httpResponse = responseEnd - responseStart;
this.buildDOM = domLoaded - responseEnd;
this.render = loadEventEnd - domLoaded;
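The subtraction chain above can be sanity-checked with plain numbers. Here is a minimal, self-contained sketch; the millisecond timestamps are invented for illustration, standing in for the map returned by the Navigation Timing call:

```java
import java.util.HashMap;
import java.util.Map;

public class PageTimerMath {
    public static void main(String[] args) {
        // Sample Navigation Timing values (epoch ms), invented for illustration
        Map<String, Long> data = new HashMap<>();
        data.put("navigationStart", 1000L);
        data.put("connectEnd", 1080L);
        data.put("requestStart", 1085L);
        data.put("responseStart", 1300L);
        data.put("responseEnd", 1450L);
        data.put("domContentLoadedEventStart", 2100L);
        data.put("loadEventEnd", 2600L);

        long navStart = data.get("navigationStart");
        long duration = data.get("loadEventEnd") - navStart;                      // full page load
        long networkTime = data.get("connectEnd") - navStart;                     // DNS + TCP connect
        long httpRequest = data.get("responseStart") - data.get("requestStart");  // server think time
        long httpResponse = data.get("responseEnd") - data.get("responseStart");  // content download
        long buildDOM = data.get("domContentLoadedEventStart") - data.get("responseEnd");
        long render = data.get("loadEventEnd") - data.get("domContentLoadedEventStart");

        System.out.println("duration=" + duration + " networkTime=" + networkTime
                + " request=" + httpRequest + " response=" + httpResponse
                + " buildDOM=" + buildDOM + " render=" + render);
    }
}
```

With these sample values the page spends most of its time building the DOM and rendering, which is exactly the kind of direction the raw timers give you before any deeper analysis.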

Now that we’ve got the page-level timers, we can store them and drive some offline analysis:




You can even decide within the test if you want to examine the current page load time or size, and pass/fail the test based on that:

// Compare current page load time vs. what's been recorded in past runs
public boolean comparePagePerformance(int KPI, CompareMethod method,
        WebPageTimersClass reference, Long min, Long max, Long avg) {
    switch (method) {
        case VS_BASE:
            System.out.println("comparing current: " + duration + " against base reference: " + reference.duration);
            return (duration - reference.duration) > KPI;
        case VS_AVG:
            System.out.println("comparing current: " + duration + " against avg: " + avg);
            return (duration - avg) > KPI;
        case VS_MAX:
            System.out.println("comparing current: " + duration + " against max: " + max);
            return (duration - max) > KPI;
        case VS_MIN:
            System.out.println("comparing current: " + duration + " against min: " + min);
            return (duration - min) > KPI;
        default:
            System.out.println("compare method not defined; skipping check");
            return false;
    }
}

Web Page Resource Timing

So far, we've been talking about page-level timing. Web page resource timing is a more in-depth review of both your code and any third-party code you are using. This is good data because you can detect latency in page performance across any page and any browser, and immediately get an indication of whether the issue relates to DNS discovery, content lookup, download, and so on.

In reality, when you're doing this in-cycle, the big changes will come from the content being downloaded: large images downloaded to small screens over cellular networks, downloads of uncompressed content, repeated downloads of JS or CSS, and so on.

Expert Tip:

How can developers get immediate, actionable insight to optimize page performance? This is where the Resource Timing API comes into play. It provides great insight into every object the browser requests: the server, timing, size, type, and so on.

Again, obtaining access to the resource timing entries is just as simple:

List<Map<String, Object>> resourceTimers = (List<Map<String, Object>>)
        w.executeScript("return window.performance.getEntriesByType('resource');");

And here’s an example of the data that is available. Lots of good stuff in here:







Each page would have a long list of resources like the above. You can summarize all the objects into types and produce a summary of totals and some distribution stats:

Below, for example, one can summarize the resources by type for each execution:
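As a minimal, self-contained sketch of such a by-type summary (the resource entries are invented for illustration; a real implementation would feed in the Resource Timing entries fetched above):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ResourceSummary {

    // The slice of a Resource Timing entry we aggregate on (illustrative values)
    static class Resource {
        final String initiatorType;
        final long transferSize; // bytes
        final long duration;     // ms
        Resource(String initiatorType, long transferSize, long duration) {
            this.initiatorType = initiatorType;
            this.transferSize = transferSize;
            this.duration = duration;
        }
    }

    // Aggregate item count, total bytes, and total ms per initiatorType
    static Map<String, long[]> summarizeByType(List<Resource> resources) {
        Map<String, long[]> summary = new LinkedHashMap<>();
        for (Resource r : resources) {
            long[] row = summary.computeIfAbsent(r.initiatorType, k -> new long[3]);
            row[0] += 1;              // item count
            row[1] += r.transferSize; // total bytes
            row[2] += r.duration;     // total ms
        }
        return summary;
    }

    public static void main(String[] args) {
        List<Resource> resources = new ArrayList<>();
        resources.add(new Resource("img", 120_000, 340));
        resources.add(new Resource("img", 80_000, 210));
        resources.add(new Resource("script", 45_000, 95));
        resources.add(new Resource("css", 12_000, 40));

        for (Map.Entry<String, long[]> e : summarizeByType(resources).entrySet()) {
            long[] row = e.getValue();
            System.out.println(e.getKey() + ": count=" + row[0]
                    + " bytes=" + row[1] + " ms=" + row[2]);
        }
    }
}
```

Grouping on `initiatorType` (img, script, css, xmlhttprequest, and so on) is what lets you say, at a glance, which kind of content is dominating page weight and load time.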

Or finally, simply gain access to all the resources directly:

Execution Time Comparison/Benchmarking

So far, we've gotten access to the raw data and conducted some level of analysis on it. At the beginning of this article we defined shift left as 'deliver insight, early and easily'. Now, given a web page, how about we set a 'baseline', and from then on, on every execution, measure the responsiveness, provide a pass/fail, and produce a full comparison of the current page data vs. the baseline? With a little code, that's possible too.

Here’s the top-level, page level summary of current vs. ‘baseline’ run:

There isn't a material difference in the number of items, but you can see that the page load time is almost 3 seconds longer. At first glance, it seems the rendering time accounts for most of the increase.


Now, here’s the comparison between the type summary:





This table compares the total items, size, and duration by type against the baseline. It's not surprising that no new types of content were introduced on this page, nor that there are no massive changes in the number of elements per type, given the last run was just a few days earlier.

Still, despite there being only one additional image in total, it appears images drive most of the latency in loading the page.

To take a closer look, here are the images with the largest load time.





Interestingly, even images that were part of the older page still took longer to load:


Putting It All Together

As we've seen, it's possible to examine page responsiveness across different browsers. It's also possible to compare the page and resource metrics against a previous run to extract actions for optimization, or to detect a defect. The nice thing is that this can be done for any test: smoke, regression, even production. It does not require any additional infrastructure, as it simply runs within the target browser. Results can be embedded into your reporting solution, and overall, performance can become part of your agile quality activity.
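Wired into a test, the pass/fail decision reduces to the same rule the VS_BASE branch used earlier: fail when the current load time exceeds the baseline by more than an allowed KPI slack. A self-contained sketch with illustrative numbers:

```java
public class PerformanceGate {

    // Fail when the current load time exceeds the baseline by more than the allowed KPI slack
    static boolean regressed(long currentMs, long baselineMs, long kpiMs) {
        return (currentMs - baselineMs) > kpiMs;
    }

    public static void main(String[] args) {
        long baseline = 2_400L; // ms, recorded on a known-good run (illustrative value)
        long kpi = 1_000L;      // allowed slack before the test fails

        System.out.println(regressed(2_900L, baseline, kpi)); // 500 ms over budget -> false
        System.out.println(regressed(5_300L, baseline, kpi)); // 2,900 ms over budget -> true
    }
}
```

Keeping the KPI explicit per page avoids one global threshold failing fast pages and passing slow ones; each page regresses only relative to its own history.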

Code reference

The code used for this project is available as open source at

Follow up projects

  • More Performance Activities
    • HAR file: In addition to direct analysis of the page resources and metrics, it is also possible to analyze the HAR file. Unfortunately, it doesn’t seem like there are API-based analyzers readily available (most are web UI-based tools) but perhaps one can be built.
  • OCR-Based Analysis: Some tools (including Perfecto) offer visual based analysis for measurement of actual content render time. The accuracy of such measurement isn’t as high and the details available aren’t as easily translatable into action. Still, it’s a good method to measure user experience performance across screens. The OCR approach works well also for native apps.
  • Other tools: Google page speed, YSlow etc.
  • Other
    • Security: Similar to the performance testing, since the servers and resources downloaded are detailed in the logs, it should be possible to identify the set of servers and countries contributing to a web page. Possibly not all are acceptable; that would be good to know and easy to add to the agile cycle.

Looking for more information on performance testing? Click here to read.

Perfecto Blog

CleanEmail is like a Hyperloop to Inbox Zero Town

Having had the pleasure of working with TNW full-time for over three years now, I’ve accumulated more email in my work inbox than I ever thought I’d receive in a lifetime. I remember trying to develop good habits to keep it clean back when I started here, but they were all in vain – and now I’ve got some 70,000 messages taking up more than 70 percent of my allotted 30GB of G Suite storage. I’ve devised a number of search filters to help with all that unwanted mail, but Gmail isn’t designed for triaging messages in such large volumes…

This story continues at The Next Web
The Next Web