Scientists Claim They’ve Developed a Male Birth Control Pill — Again


Certain breakthroughs always seem just out of scientists’ reach.

Warp drive. Scalable fusion reactors.

And, of course, a male birth control pill.

This week, yet another team of researchers raised the hopes of reproductively responsible men everywhere by claiming they’d developed a safe and effective once-a-day male birth control pill.

However, guys shouldn’t toss their condoms just yet. While this drug seems promising, it’s still a long way from the local pharmacy.

Stephanie Page, a professor of medicine at the University of Washington, presented her team’s research into the male birth control pill, called dimethandrolone undecanoate (DMAU), at the annual Endocrine Society meeting in Chicago.

Image Credit: frolicsomepl / Pixabay

Once a day for 28 days, each of the 100 men in the trial, all between the ages of 18 and 50, ingested either a placebo or DMAU in one of three doses. On the first and last days of the study, each man gave blood samples so the researchers could determine his hormone and cholesterol levels.

According to the study, the men who took the highest dose, 400 mg, showed a “marked suppression” of testosterone levels, as well as the levels of two hormones needed to produce sperm. The researchers claim these hormone responses are “consistent with effective contraception.” That is, it would probably work as birth control.

Every subject in the trial passed all safety tests, and very few reported any symptoms traditionally linked to too much or too little testosterone, Page said in a press release. They had no problems with sexual function and no mood changes, either, she noted during the presentation.

However, each man taking DMAU did gain weight and had lower levels of HDL cholesterol (that’s the “good” kind).

This isn’t the first experimental male contraceptive to have these side effects. Typically, drugs like these have two major problems: the oral testosterone they contain damages the liver, and the drugs leave the body too quickly — men would need to take the pills at least twice a day for them to be effective.

DMAU actually addresses those issues. To the first point: the dimethandrolone in DMAU is a modified testosterone designed to avoid liver toxicity. To the second: the long-chain fatty acid undecanoate keeps the drug in the user’s system for a full 24 hours.

So, that’s the good news. The bad news is that this was a very small study. A cohort of 100 men, reduced to 83 by the end of the brief 28-day trial, is hardly grounds for FDA approval. Still, Page says the team is currently conducting longer-term DMAU studies.

Even more reason to take the findings with a grain of salt: the team’s research has not yet been published in a peer-reviewed journal. Until other members of the scientific community have a chance to pick apart the study and verify its methods and conclusions, DMAU will remain just another in the long list of potential male birth control pills.


Facebook Lite reaches the US, UK, and other developed countries


The Facebook Lite app for Android has amassed more than 200 million users since its launch in 2015. It was created to provide a better experience on lower-end devices with slow data connections in developing markets, just like Messenger Lite. And also like the company’s chat app, Facebook Lite is now headed to some developed markets as well. Expect to see it become available to download in the Play Store today. That is, if you are in the US, UK, Canada, Australia, France, Germany, Ireland, or New Zealand. “We’ve seen that even in some developed markets people can have lower…


Newly Developed Microneedles Can Dissolve in the Skin to Deliver Drugs


A fear of needles may soon be a thing of the past, as a team at the University of Texas at Dallas has come up with a painless alternative to conventional syringes.

With the new method, drugs would still be injected into the body, but using a microneedle that patients can’t even feel. The needle is so thin that it would break off under the skin, dissolving as it releases the intended substance over time. Although the system wouldn’t work for every kind of medication, ultra-thin needles can deliver a variety of drugs, provided those drugs are made of small molecules.

The team’s research, published in ChemRxiv, explains that the needles are made from polylactic acid, a renewable, biodegradable thermoplastic approved for use by the U.S. Food and Drug Administration, using a 3D printing technique known as “fused deposition modeling.” To achieve the tapered shape, the needles are chemically etched after the printing process. The team can create microneedles with tips as small as 1 micrometer (one millionth of a meter) and needle shapes with widths of 400 to 600 micrometers. For comparison, a human red blood cell is about 5 micrometers wide.

While hypodermic needles are the standard for giving injections, they can be painful and can leave ugly bruises if not handled with precision. They also produce biohazardous waste. The new microneedles would solve all of these problems at once: the injection is painless, it could theoretically be administered by just about anyone, and because the needle dissolves under the skin as it delivers the drug, it creates no waste.

One drawback to the concept is that while the needles themselves can be produced cheaply, the initial design requires expensive photolithographic equipment. Still, early results are encouraging: the finished microneedles have been successfully tested on parafilm and pig skin to evaluate their ability to puncture and break off as intended. According to Cosmos Magazine, the researchers found that applying sideways force worked well, with 84 percent of needles breaking off.

Research into alternative ways to deliver drugs has become increasingly popular in recent years. Last June, scientists from Georgia Tech and Emory University developed a sticker patch equipped with microneedles that could be used to deliver vaccines. In February, researchers from the University of Copenhagen released a study suggesting edible QR codes could be the next step in drug delivery.

Regardless of where the medical community goes, it’s safe to say that patients around the world would welcome painless injections, or no injections at all. Imagine a world in which parents no longer have to convince their children that the scary-looking needle isn’t that bad, even when it is, in fact, terrifying to look at.


AI camera tech in LG V30S was developed with EyeEm, which beat Google last year

LG launched its own artificial intelligence platform – ThinQ – and brought it to phones with the new LG V30S ThinQ. Among its new features is AI CAM, a layer of artificial smarts on top of the camera viewfinder. It recognizes what’s in the scene and adjusts the camera settings accordingly. AI CAM was developed in collaboration with EyeEm and its Vision API. The company’s app was recently featured on Google’s Android Excellence list, but there’s more to it than that – EyeEm’s image recognition is better than Google’s. A benchmark performed last year graded multiple algorithms on how…


Apple details how it developed the iPhone X’s ‘Portrait Lighting’ feature


While the iPhone today is capable of taking absolutely stunning and arguably best-in-class photos, that wasn’t always the case. Going back in time a bit, the original iPhone camera was nothing special. In fact, it wasn’t even respectable relative to what other smartphone manufacturers were releasing at the time. Over time, though, Apple began to invest more in camera technologies, and it wasn’t long before new iPhone models were routinely setting new standards of excellence for mobile photography.

In recent years, Apple has undeniably taken mobile photography to the next level, thanks in large part to the company’s control of both the iPhone hardware and software. Most recently, Apple released a camera feature it dubs Portrait Lighting. The feature takes advantage of the iPhone’s dual-camera system and enables users to take photos designed to resemble professional shots that would ordinarily be taken in a studio environment.

Currently available on the iPhone X, iPhone 8 Plus, and iPhone 7 Plus, Portrait Lighting in iOS 11 is truly remarkable. And speaking to the lengths Apple went to in perfecting it, the company released a new ad earlier today that details how the feature was brought into existence.

As detailed in the video below, Apple worked with global image makers and top photographers in order to get the lighting effects just right. What’s more, Apple implemented advanced machine learning concepts to help create “studio-quality portraits.”

Apple has always been a company known to sweat the small stuff, and the video above is yet another example of the company’s unrivaled attention to detail.


Researchers Have Developed a Potential Blood Test for Autism

First of Their Kind

Researchers at the University of Warwick have developed two tests that could potentially detect autism in children. Both tests, one blood and one urine, are based on a previously discovered link between damage to proteins in blood plasma and autism. The team believes the tests to be the first of their kind and hopes they could help improve early detection of autism spectrum disorders (ASD).

The study, published in the journal Molecular Autism, confirmed previous research that had linked certain mutations in amino acid transporters with ASD. Proteins in blood plasma can be damaged by two processes, oxidation and glycation, and the researchers developed tests that can detect that damage.

Armed with this knowledge and using the most reliable of the tests they developed, the team took urine and blood samples from 38 children with ASD, as well as from a control group of 31 children who had not been diagnosed with ASD. With the help of an algorithm developed using artificial intelligence (AI), the team figured out how the two groups were chemically different.
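The paper doesn’t spell out that algorithm, but the underlying idea, fitting a classifier to biomarker measurements from the two groups, can be sketched. Below is a minimal illustration in Python with scikit-learn; the data are synthetic stand-ins, and the model choice is an assumption, not the Warwick team’s actual method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins for oxidation/glycation damage markers measured
# in plasma and urine: 38 children with ASD, 31 controls (hypothetical
# feature values; the real study's measurements are not reproduced here).
rng = np.random.default_rng(0)
asd_group = rng.normal(loc=1.0, scale=1.0, size=(38, 6))
control_group = rng.normal(loc=0.0, scale=1.0, size=(31, 6))

X = np.vstack([asd_group, control_group])
y = np.array([1] * 38 + [0] * 31)  # 1 = ASD, 0 = control

# Learn a decision boundary between the two groups and estimate how
# well it generalizes with 5-fold cross-validation.
clf = LogisticRegression()
print("mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```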

“With further testing we may reveal specific plasma and urinary profiles or ‘fingerprints’ of compounds with damaging modifications,” said Naila Rabbani, Reader of Experimental Systems Biology at the University of Warwick and the research team’s lead. “This may help us improve the diagnosis of ASD and point the way to new causes of ASD.”

Researchers still do not completely understand why people develop autism. About 30 to 35 percent of ASD cases are linked to genetic variants, but there is no exact formula for predicting autism. As with many other conditions, genetics, environment, and other factors all play a role. In recent years, evidence has even been put forward that gut bacteria could indicate whether or not a person has an ASD.

Reliable biomarkers for ASD may not be far off, given what the team from Warwick has accomplished: their research demonstrated that measuring protein damage could be a dependable indicator of whether or not a child has ASD.

“Our discovery could lead to earlier diagnosis and intervention,” said Rabbani. “We hope the tests will also reveal new causative factors.”

ASD cases are characterized by a wide variety of symptoms, ranging from mild behavioral issues to debilitating compulsive behavior, anxiety, cognitive impairment, and much more. Because the symptoms are so varied and the causes aren’t yet fully understood, diagnosis and treatment can be an arduous journey.

If tests can be developed that allow families to receive a diagnosis sooner, they will also give those families the ability to seek intervention earlier, which can be essential for helping kids with ASD, and their families, navigate the world and improve their quality of life.


Brains on a battery: Low-power neural net developed, phones could follow


Researchers at MIT have paved the way to low-power neural networks that can run on devices such as smartphones and household appliances. Andrew Hobbs explains why this could be so important for connected applications and businesses.

Many scientific breakthroughs are built on concepts found in nature – so-called bio-inspiration – such as the use of synthetic muscle in soft-robotics.

Neural networks are one example of this. They depart from standard approaches to computing by mimicking the human brain. Usually, a large network of neurons is developed without task-specific programming. The network learns from labelled training data and applies those lessons to future data sets, gradually improving its performance.

For example, a neural network may be fed a set of images labelled ‘cats’ and from that be able to identify cats in other images, without being told what the defining traits of a cat might be.
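As a toy sketch of that idea (not the MIT work itself, and with random stand-in feature vectors rather than real images), here is a minimal single-layer network in Python/NumPy that learns a decision rule purely from labelled examples:

```python
import numpy as np

# Toy labelled data: each row stands in for an image's feature vector,
# and y records whether that image was labelled 'cat' (1) or not (0).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
hidden_rule = rng.normal(size=8)              # the pattern to be learned
y = (X @ hidden_rule > 0).astype(float)

# A single-layer network trained by gradient descent on logistic loss:
# the weights adjust gradually as the network sees labelled examples.
w = np.zeros(8)
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))        # predicted P('cat')
    w -= lr * X.T @ (p - y) / len(y)          # nudge weights toward labels

# The trained network now classifies unseen inputs without ever being
# told what the defining traits of a 'cat' are.
X_new = rng.normal(size=(5, 8))
print((1.0 / (1.0 + np.exp(-(X_new @ w))) > 0.5).astype(int))
```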

But there’s a problem. The neurons are linked to one another, much as neurons in our own brains are linked by synapses. These nodes and connections typically have weights associated with them that adjust as the network learns, affecting the strength of the signal output and, by extension, the final sum.

As a result, constantly transmitting a signal and passing data across this huge network of nodes requires large amounts of energy, making neural nets unsuited to battery-powered devices, such as smartphones.

As a result, neural network applications such as speech- and face-recognition programs have long relied on external servers to process the data that has been relayed to them, which is itself an energy-intensive process. Even in humanoid robotics, the only route to satisfactory natural language processing has been via services such as IBM’s Watson in the cloud.

A new neural network

All that is set to change, however. Researchers at the Massachusetts Institute of Technology (MIT) have developed a chip that increases the speed of neural network computations by three to seven times, while cutting power consumption by up to 95 percent.

This opens up the potential for smart home and mobile devices to host neural networks natively.

“The general processor model is that there is a memory in some part of the chip, and there is a processor in another part of the chip, and you move the data back and forth between them when you do these computations,” Avishek Biswas, the MIT graduate student in electrical engineering and computer science who led the chip’s development, told MIT News.

Traditionally, neural networks consist of layers of nodes that pass data upward, one layer to the next. Each node multiplies the data it receives by the weights of the relevant connections and sums the results. The outcome of this process is known as a dot product.

“Since these machine-learning algorithms need so many computations, this transferring back and forth of data is the dominant portion of the energy consumption,” said Biswas.

“But the computation these algorithms do can be simplified to one specific operation, the dot product. Our approach was, can we implement this dot-product functionality inside the memory, so that you don’t need to transfer this data back and forth?”
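In software terms, the operation Biswas describes looks like the following. This is a toy Python/NumPy sketch of one layer’s dot products (sizes are illustrative); the comments mark the data movement an in-memory design avoids.

```python
import numpy as np

# One layer of a toy network: each of 16 output nodes computes the dot
# product of the incoming data with that node's stored weight row.
rng = np.random.default_rng(1)
inputs = rng.normal(size=64)            # activations arriving at the layer
weights = rng.normal(size=(16, 64))     # weight rows, held in memory

# Conventional processor model: every weight row is fetched from memory
# into the compute unit before it can be multiplied -- exactly the
# back-and-forth traffic that dominates energy consumption.
outputs = np.array([row @ inputs for row in weights])

# The MIT chip aims for the same arithmetic result, but computed where
# the weights are stored, so the rows never cross a memory bus.
assert np.allclose(outputs, weights @ inputs)
```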

A mind for maths

This process will sometimes occur across millions of nodes. Given that each node weight is stored in memory, this amounts to enormous quantities of data to transfer.

In a human brain, synapses connect whole bundles of neurons, rather than individual nodes. The electrochemical signals that pass across these synapses are modulated to alter the information transmitted.

The MIT chip mimics this process more closely by calculating dot products for 16 nodes at a time, in analog, within the memory itself. The combined voltages are then converted to a digital signal and stored for further processing, drastically reducing the number of calls the system makes to memory.

While many networks have numerous possible weights, this new system operates with just two: 1 and -1. This binary scheme acts as a switch within the memory itself, simply closing or opening a circuit. While this would seem to reduce the accuracy of the network, the reality is just a two to three percent loss – perfectly acceptable for many workloads.
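A rough numerical sketch of that trade-off, assuming a simple sign-based binarization with a per-row scale (a common trick in binarized networks, and only an assumption here, not necessarily MIT’s exact scheme):

```python
import numpy as np

rng = np.random.default_rng(2)
inputs = rng.normal(size=64)
weights = rng.normal(size=(16, 64))

# Full-precision dot products for 16 nodes at a time, as a reference.
reference = weights @ inputs

# Binarized weights: each weight collapses to +1 or -1, which in
# hardware reduces every multiply to closing or opening a circuit.
binary_w = np.where(weights >= 0.0, 1.0, -1.0)

# Scaling each row by its mean magnitude keeps the binary outputs on
# roughly the same scale as the full-precision ones.
scale = np.abs(weights).mean(axis=1, keepdims=True)
approx = (binary_w * scale) @ inputs

# How closely the 1-bit version tracks the full-precision result.
print("correlation:", np.corrcoef(reference, approx)[0, 1])
```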

Internet of Business says

At a time when edge computing is gaining traction, the ability to bring neural network computation out of the cloud and into everyday devices is an exciting prospect.

We’re still uncovering the vast potential of neural networks, but they’re undoubtedly relevant to mobile devices. We’ve recently seen their ability to predict health risks in fitness trackers, such as Fitbit and Apple Watch.

By allowing this kind of work to take place on mobile devices and wearables – as well as other tasks, such as image classification and language processing – there is huge scope to reduce energy usage.

MIT’s findings also open the door to more complex networks in the future, without having to worry so much about spiralling computational and energy costs.

However, the far-reaching power of abstraction inherent in neural networks comes at the cost of transparency. Their methods may be opaque – so-called black box solutions – and we expose ourselves to both the prejudices and the restrictions that can come with limited machine learning models, not to mention any training data that replicates human bias.

Of course, the same problems, lack of transparency and bias, can be found in people too, and we audit companies without having to understand how any individual’s synapses are firing.

But the lesson here is that, when the outcome has significant implications, neural networks should be used alongside more transparent models whose methods can be held to account, just as critical human decision-making processes must adhere to rules and regulations.


Human Eggs Developed to Maturity in the Lab for the First Time

For the first time, scientists have successfully taken human eggs from their earliest stages to maturity in a lab setting. This accomplishment is set to give us new insight into how human eggs develop, and it could potentially offer a compelling new option to individuals who are at risk of fertility loss.

For the study, researchers at the University of Edinburgh took ovarian tissue from 10 people in their late 20s and 30s. Using various nutrients, they encouraged eggs to develop to maturity, the point at which they could be fertilized. A total of 48 eggs reached the final stage of the process, and of those, nine reached full maturity.

Currently, individuals at risk of infertility due to radiotherapy or chemotherapy can have ovarian tissue removed ahead of treatment and re-implanted at a later date. For young people who haven’t yet gone through puberty and aren’t yet producing eggs, this is the only option for preserving fertility, Evelyn Telfer, co-author of the research, told The Guardian.

That process raises concerns that re-implanting tissue taken prior to cancer treatment might reintroduce cancer cells into an individual’s body. The new procedure alleviates those concerns because instead of implanting tissue, the doctor would implant an embryo, according to Telfer.

Researchers still have much more work to do before this procedure could be used in practice. At the very least, it will take a number of years to ensure that the mature eggs produced are healthy.

According to the researchers, the eggs they grew developed faster than they would have in the body, a finding that calls for further investigation. Moreover, a small cell known as a polar body grew to an unusually large size during the process, which could indicate developmental abnormalities. The team wants to attempt to fertilize the eggs so that it can perform tests on the resulting embryos.

Still, this is a major milestone in fertility research, and it could give new hope to those who may not have had any before.


Google Assistant now understands Hindi, Actions on Google can be developed in Russian (ru-RU)

Google Assistant’s language support is beyond confusing. I’ve been covering it for over six months now, and I still don’t understand why a language can work in one version of Assistant but not another. Take Hindi, for example: it works in Allo, but not in other instances of Assistant, like the main one on your phone. That’s changing now, though.

If you have your phone’s language set to English – India, you’ll be able to activate Assistant by tapping and holding the Home button on any phone running Android 5.0 and above (tablets probably won’t work, as they only support US English for now).


NASA Has Developed Autonomous Space Navigation That Uses Pulsars

X-Ray Navigation

NASA may have just improved our potential for deep space exploration by inventing a new type of autonomous space navigation. Known as Station Explorer for X-Ray Timing and Navigation Technology, or SEXTANT, the technology uses pulsars — rotating neutron stars that emit electromagnetic radiation — to determine the location of objects in space.

The way SEXTANT uses pulsars has been compared to how GPS provides drivers with accurate positioning and navigation using satellites orbiting Earth. The pulsars SEXTANT relies on are best observed in the X-ray spectrum, where their beams of radiation essentially turn them into lighthouses.

To show that SEXTANT is an idea worth building on, a team of NASA engineers demonstrated the technology’s ability to locate NASA’s Neutron-star Interior Composition Explorer, or NICER. NICER — an observatory roughly the size of a washing machine — is currently orbiting Earth while attached to the International Space Station. It has been tasked with studying both neutron stars and pulsars, making it the perfect partner for SEXTANT’s first experiment.

An illustration of NICER attached to the International Space Station. Image Credit: NASA

“This demonstration is a breakthrough for future deep space exploration,” said Jason Mitchell, SEXTANT project manager, in a NASA press release. “As the first to demonstrate X-ray navigation fully autonomously and in real-time in space, we are now leading the way.”

Over two days in November, NASA directed NICER to take readings from four specific pulsars using its 52 X-ray telescopes and silicon-drift detectors. NICER then fed the information it gathered from the pulsars to SEXTANT. Within eight hours, SEXTANT was able to autonomously determine NICER’s location in Earth’s orbit to within a 10-mile radius. SEXTANT’s readings were compared with NICER’s own onboard GPS receiver, confirming their accuracy.
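NASA hasn’t published SEXTANT’s actual algorithms here, but the geometric principle can be sketched: pulses from each pulsar arrive earlier or later depending on the spacecraft’s displacement along that pulsar’s line of sight, and with several pulsars the position falls out of a least-squares solve, much as in GPS. The following is a toy, linearized Python/NumPy illustration with made-up pulsar directions and noise levels, not flight code.

```python
import numpy as np

C_KM_S = 299_792.458  # speed of light in km/s

# Unit vectors toward four hypothetical pulsars; real X-ray navigation
# uses catalogued millisecond pulsars with precise timing models.
n = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.577, 0.577, 0.577]])

true_pos = np.array([4000.0, -2500.0, 1200.0])  # km from a reference point

# Pulses from pulsar i arrive shifted by dt_i = (n_i . x) / c, where x
# is the spacecraft's displacement along that pulsar's line of sight.
dt = n @ true_pos / C_KM_S
dt += np.random.default_rng(3).normal(scale=5e-6, size=4)  # timing noise

# Solve the overdetermined system n @ x = c * dt for position, GPS-style.
est, *_ = np.linalg.lstsq(n, dt * C_KM_S, rcond=None)
print("estimate (km):", est, " error (km):", np.linalg.norm(est - true_pos))
```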

“This was much faster than the two weeks we allotted for the experiment,” said SEXTANT System Architect Luke Winternitz in the press release. “We had indications that our system would work, but the weekend experiment finally demonstrated the system’s ability to work autonomously.”

Navigating Deep Space

SEXTANT is far from complete, however, and NASA predicts it will be several years before a better version of the autonomous space navigation system comes along. When it does, the tech will fill a huge need for space exploration. While GPS is fine for Earth and low-Earth orbit, its signal weakens the farther an object gets from GPS satellites. As such, NASA’s X-ray navigation will be required for spacecraft sent far beyond Earth.

“This successful demonstration firmly establishes the viability of X-ray pulsar navigation as a new autonomous navigation capability,” Mitchell added in the press release. “We have shown that a mature version of this technology could enhance deep-space exploration anywhere within the solar system and beyond.”

With the initial experiment out of the way, NASA intends to improve the system’s flight and ground software for a second demonstration scheduled for later this year. Before SEXTANT can be considered for full-scale operations, however, NASA engineers must increase the sensitivity of its instruments while at the same time decreasing its size, weight, and power consumption.

NASA believes the autonomous space navigation system could eventually be used during human spaceflight missions, or to calculate position on missions to Jupiter, Saturn, or their respective moons.
