The smartphone market is extremely saturated at this point, so it’s not often we see something actually unique. Xiaomi makes plenty of regular phones, but it has also produced a few wild devices like the Mi Mix family. It appears that another strange phone from Xiaomi is on the way – one tailored to gamers.
For anyone who suffers through a long wait while uploading large video and photo files to their computer, there’s a new Kickstarter for a portable hard drive that says it can back up your files and offer faster on-the-go file uploads that you can start right after taking photos. The company suggests you can do away with bringing your laptop to outdoor photo shoots and instead use its mobile app for editing files and uploading to Dropbox.
The Kickstarter is for a device called the Gnarbox 2.0, which lets you insert your SD card into the hard drive and use it as an additional backup tool. It’s an upgrade to its predecessor, the Gnarbox 1.0, which already offered mobile storage and uploading for large files, also through…
A few hours ago, OnePlus released a teaser saying that the phone will have “the speed you need,” and also confirmed the OnePlus 6 name. Now, in a new post, CEO Pete Lau discussed the performance of the upcoming flagship and said that the company has created a specialized team, Team FSE (Fast, Stable, Efficient), to focus solely on maximizing the power of its flagship devices. In addition to the Snapdragon 845 SoC, he also confirmed an 8GB RAM / 256GB storage version of the smartphone. It is not clear whether the company will offer an 8GB RAM / 128GB storage version as well. In the forum post, Pete Lau, CEO and founder of OnePlus, said: “Understanding the relationship between people and smartphones, we brainstormed with our community of OnePlus users to learn what the users want most – a flexible, light and quick experience. Keeping these parameters at heart, we have conceptualized the OnePlus 6. To make this experience a reality we have established within our ranks a special R&D unit, called Team FSE (Fast, Stable, Efficient). The one goal of this unit is to ensure that we can transcend the current norm with providing our users with what we believe to be a …” – Fone Arena
OnePlus has taken to its forum to confirm that the upcoming OnePlus 6 (whose name was recently confirmed in a tweet, to no one’s surprise) will be coming in a 256GB variant. Apart from that, the company also confirmed that it will have 8GB of RAM (same as the last two models) and run on the new top-of-the-line Qualcomm Snapdragon 845 processor. Okay, then. And that’s pretty much it. OnePlus did spend far too many words in its forum post describing this one small feature, going on about the “pursuit of burdenless speed” (whatever that means) and also about its FSE Team, which stands…
OnePlus CEO Pete Lau has confirmed that the OnePlus 6 will include a Snapdragon 845 processor as well as up to 8GB of RAM and 256GB of storage. “We believe a truly ‘burdenless user experience’ can transcend the current norm,” Lau said. “In experiential terms this means the phone functions just the way you expect it to, without lag or disturbance. In design terms, a focus on beauty in simplicity, with no unnecessary features added.”
“Through this process, we have realized that a truly fast and smooth user experience can realize a burdenless experience,” the CEO explained.
Rumors have suggested that the OnePlus 6 could have a Snapdragon 845, 8GB of RAM, and 256GB of storage, so it’s nice to have these features officially confirmed by OnePlus. The company has long focused on making its flagship phones competitive with other flagships on the market, and it’s starting to look like the OnePlus 6 will be no different.
OnePlus has said that it’s aiming to launch the OnePlus 6 in late Q2 2018, so expect a few more teasers and spec drops like this in the coming weeks.
OnePlus CEO Pete Lau published a post earlier today entitled ‘Our Pursuit of Burdenless Speed.’ Most of it just discusses how OnePlus is committed to producing a smooth and powerful user experience, even creating a specialized team called Team FSE (Fast, Stable, Efficient) for this very effort.
With the 5T out of stock, the OnePlus 6 can’t be far from store shelves. Some say it will be unveiled by the end of the month, with shipments scheduled for a couple of weeks after that. Which raises the question: how much? OnePlus has a tendency to raise the price of each new model, and rumor has it that the 6 will be no exception. The rumored prices are as follows: $525 for the base 64GB model, $600 for the 128GB one, and $700 for the top-of-the-line 256GB model. Note that these are prices for China (converted to dollars), so US and European prices should be higher due to additional…
Nuvyyo’s over-the-air Tablo DVRs are potentially big bargains if you want the convenience of recording shows without a pricey cable package, but the up-front cost (dictated in part by the built-in storage) can make them daunting. The company has a si… – Engadget RSS Feed
We’ve known since last year that Amazon Music was planning to shut down its dedicated cloud music locker. Now, we have a date for when that process will begin. In an email to Amazon Music users, the company says uploaded songs will be removed from a user’s library on April 30th, 2018. You can, however, keep any music in the cloud by proactively going to your Music Settings and clicking the “Keep my songs” button.
Back in December, Amazon stopped letting users upload new tracks to Music Storage, which holds up to 250 songs for free. The company said at the time that by January 2019, users wouldn’t be able to download or stream tracks they’ve uploaded to Music Storage, so it sounds like you’ll still have many months between April and next…
Pure Storage and NVIDIA have launched “AI in a box” for enterprise customers. Chris Middleton talks to Pure Storage CTO Alex McMullan about the strategy behind the team-up.
Flash storage provider Pure Storage and hardware giant NVIDIA have announced what they say is a state-of-the-art AI supercomputer, ready to be slotted into a customer data centre.
AIRI, which the companies describe as “the industry’s first comprehensive, AI-ready infrastructure”, is designed to help organisations deploy artificial intelligence at scale, and speed time to insight.
The new converged-infrastructure appliance is essentially “AI in a box”, and is intended to provide an architecture that “empowers organisations with the data-centric infrastructure needed to harness the true power of AI”, according to a joint announcement from the companies.
What’s in the box?
The integrated hardware/software solution includes Pure Storage FlashBlade, a storage platform architected for analytics and AI, and four NVIDIA DGX-1 supercomputers, delivering “four petaflops of performance” via NVIDIA Tesla V100 GPUs.
The systems are interconnected with Arista 100GbE switches, supporting GPUDirect RDMA for maximum distributed performance. AIRI is also supported by the NVIDIA GPU Cloud deep-learning stack and Pure Storage’s new AIRI Scaling Toolkit.
All of this high-performance, optimised hardware will enable data scientists to “jumpstart their AI initiatives in hours, rather than weeks or months”, said the announcement.
As some sections of the media zero in on the perceived problems and ethical challenges associated with AI, Pure Storage stressed the social benefits of the technology.
“AI has fantastic potential for aiding humanity,” said Charles Giancarlo, CEO of Pure Storage. “It has the capacity to significantly improve the quality of all of our lives. AIRI will accelerate AI research, enabling innovators to more rapidly make advances to create a better world with data.”
That’s all very well, but how much does all this cost? On that point, Pure Storage and NVIDIA remained tight-lipped and pointed to their channel partners, but the specification of the hardware suggests this may be for enterprises with deep pockets. Perhaps AIRI can tell us.
The CTO speaks out
Pure Storage CTO Alex McMullan told Internet of Business that while AIRI is an “industry first”, the focus is on making AI “accessible to just about everyone”.
“If you can actually drop an AI supercomputer into a customer data centre in a couple of hours, then that’s a huge time-to-market benefit,” he said.
“We’ve worked with NVIDIA to make a high-end, state-of-the-art supercomputer available in 24 inches of data centre infrastructure, replacing what would otherwise be racks and racks of stuff.”
What was the main driver behind the idea? “This is a very data-driven world, with big data sets and data footprints,” explained McMullan. “And we have a whole separate thread here at Pure Storage about data gravity, and why that’s a challenge and concern for the industry.
“AI is really about data quality, it’s about data provenance, and it’s about having the right level of training data that is correctly tagged and indexed.
“But some of our existing customers who are doing machine learning have 200 or 300 people who do nothing but categorise and tag images and other data, because that’s what’s required in this space. They told us, ‘We spend a lot of time with wires and cables and boxes trying to plug all this stuff together, so wouldn’t it be great if…’
“That’s what started the conversation with ourselves and NVIDIA. We had a number of joint customers, but we thought it would make more sense to have a single offering.”
So does AIRI (AI-Ready Infrastructure) itself include AI software, or is the appliance optimised for other vendors’ solutions?
“It brings it up to a specific level where all the tool sets, libraries, and models that a data scientist would expect are installed on the platform. But if you have your own data sets and models you can certainly apply them,” said McMullan.
The public cloud problem
So what would the advantage be to an organisation of implementing an on-premises, appliance-based solution – as opposed to deploying something like Watson in the cloud, or any other solution?
“For me the answer is it’s all about [the problem of] the public cloud,” said McMullan. “The public cloud has challenges with data gravity. For me, the public cloud is there to deliver agility and time to market, but it’s not there to deliver scale and cost efficiencies.
“What we see quite often is that many of our existing customers experiment, integrate, and develop in the public cloud at small scale, but once they have larger deployments and data sets, once they have bigger clusters, it always comes back on premises, because that’s the most cost-effective way of deploying it.
“For me, something like Watson is very much a start, a small-scale, early-adopter technology, whereas the NVIDIA/Pure solution is very much an industrial-scale behemoth.”
Customers speak out
AIRI is launching with three named customer partners on board: outsourced call centre provider Global Response, AI business applications provider Element AI, and AI pathology specialist Paige.AI.
Paige.AI aims to transform clinical diagnoses and oncology via the use of artificial intelligence. “With access to one of the world’s largest tumour pathology archives, we needed the most advanced deep learning infrastructure available to quickly turn massive amounts of data into clinically validated AI applications,” said Dr. Thomas Fuchs, founder and chief science officer of Paige.AI.
Meanwhile, Element AI, a platform for companies to build their own AI solutions, sees AIRI as an “accelerant” for complex projects. “AIRI represents an exciting breakthrough for AI adoption in the enterprise, shattering the barrier of infrastructure complexities and clearing the path to jumpstart any organisation’s AI initiative,” said Jeremy Barnes, chief architect at Element AI.
Finally, Global Response has begun development of a call centre system that allows for the real-time transcription and analysis of customer support calls. “We’ve reached an inflection point where integration of AI throughout our organisation is critical to the ongoing success of our business,” said Stephen Shooster, Co-CEO, Global Response.
“While we wanted to move quickly, the infrastructure for AI was slowing us down, because it is very complex to deploy. To truly operationalise AI at scale, we needed to build a simple foundation powerful enough to support the entire organisation.”
Internet of Business says
McMullan’s own background is in financial services technology, with stints at UBS and Barclays, along with Sun Microsystems and British Aerospace. However, he said that despite sectors such as financial services being in the vanguard of AI adoption, AI is really just about “shovelling large amounts of data into a GPU engine”.
“The cardinality and the structure of that data doesn’t really matter. The chief thing is it’s best at finding trends, patterns, and outliers,” he said.
However, one thing that is increasingly important to AI adopters – and to legislators and regulators – is the question of AI’s transparency and ‘auditability’. Does an AI in a box make ‘showing the workings’ of AI easier, or harder?
“In terms of transparency, I don’t think this changes the equation,” said McMullan. “It’s still using the same native software tools, it’s simply allowing you to get to a result considerably faster and more reliably.
“However, I think that the combination of Pure Storage and NVIDIA allows you to have a much larger training set, which to me is the key foundation of any kind of machine-learning-based approach. The better and bigger your training set, the better the results you’re going to get.
“But the workings inside the box are still a software output, and I don’t think we’re there yet in terms of understanding the complete result based on the input.”