

12 Feb

Google Maps could soon let you share battery status along with location


In the not-so-distant future, you may be able to assure your friends and family that no terrible fate has befallen you simply because a terrible fate has befallen your phone (which is to say, it ran out of battery). A recent Android Police APK teardown of the Google Maps 9.71 beta suggests that the app will soon show your smartphone’s battery stats when you share your location, which could mean that when you let your loved ones know where you are, you can also let them know how much juice you have left in your phone. That could be the key to avoiding many panicky texts and voicemails.

As it stands, you can already share your location in Google Maps. You need only tap the menu in the left-hand corner of the app and choose whom you’d like to share your location with and for how long. But as one might imagine, it gets pretty alarming when a contact suddenly drops off the map because their phone battery died. Google looks to be offering a fix by letting the people viewing your location also check your remaining battery status, though it likely won’t be terribly specific.

Rather, Android Police suggests, the basic format will read something along the lines of, “[Person]’s battery level is between 50 percent and 75 percent and is charging.” If Maps is unable to determine the battery level, either because permission hasn’t been granted or the information is outdated, it will read, “[Person]’s battery level is unknown.”
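To picture just how coarse that reporting would be, here’s a minimal sketch (in Python, purely for illustration) of turning a raw battery reading into the kind of bucketed message Android Police describes. The function name, the quartile-style thresholds, and the exact wording are assumptions on our part, not Google’s implementation.

```python
# Hypothetical sketch: bucket a raw battery percentage into coarse ranges.
# Thresholds, names, and message wording are assumptions, not Google's code.

def battery_status_message(name, percent=None, charging=False):
    """Build a coarse battery message like the one described in the teardown."""
    if percent is None:
        # Covers the "permission not granted" or "information is outdated" case.
        return f"{name}'s battery level is unknown."

    # Report a quartile-style range instead of the exact percentage (assumed).
    for low, high in [(0, 25), (25, 50), (50, 75), (75, 100)]:
        if percent <= high:
            break
    suffix = " and is charging" if charging else ""
    return f"{name}'s battery level is between {low} percent and {high} percent{suffix}."


print(battery_status_message("Alex", percent=62, charging=True))
# Alex's battery level is between 50 percent and 75 percent and is charging.
print(battery_status_message("Alex"))
# Alex's battery level is unknown.
```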

Additionally, Maps looks to soon be adding the ability to create shortcuts to commonly visited mass transit stations, so you can figure out the fastest route to your nearest bus or subway stop in the midst of a pouring rain. In fact, Maps appears to be overhauling its mass transit features entirely — Android Police notes a number of lines in the app’s code that appear to be brand new, and while it’s unclear exactly what will change, our fingers are crossed for an improvement.

As ever, don’t hold your breath for any of these updates. It often takes quite some time for these new features to roll out, and it’s always possible that the features won’t come to fruition at all if testing proves unsuccessful.



12 Feb

Find out what to expect from Android P in our extensive guide


The most recent version of Google’s Android, 8.0 Oreo, is slowly making its way out into the world and toward the phone in your hand. But what is coming next? Google’s alphabetical naming system suggests it’ll be Android P, and here’s everything you need to know about it now.

Android P?

Google names its major Android versions after some kind of sweet or dessert. The most recent is Oreo, and we’ve had everything from Marshmallow to Cupcake in the past. What will the P stand for? It’s apparently being called Pistachio Ice Cream internally, but that’s not an indication of its final name, and Google has changed its mind at the last minute in the past. What do you think it should be called? Remember, it has to be something sweet, and start with the letter P.

Design

How will Android P look? It may look quite different from existing versions. Most notably, Android P is rumored to adopt the iPhone X’s screen notch. The iPhone X’s screen covers most of the phone’s front panel, apart from a controversial notch at the top where the camera and various sensors are placed. Google may be building support for a similar screen layout into Android P, according to sources speaking to Bloomberg. While phones that run Android are expected to arrive with a screen notch before Android P is released, they will run a custom version of Android modified by the manufacturer, much like how the Essential Phone handles the layout now.

The adoption of the screen notch is potentially just a small part of a major redesign for Android. Google wants to attract people who choose the iPhone over an Android device by “improving the look of the software,” sources told Bloomberg. The reception to Apple’s notch on the iPhone X was mixed, and many Android users prefer the deeper customization available in Google’s software over Apple’s iOS, so bringing Android closer in line with Apple’s design choices may prove controversial.

The notch may be just one of several device designs Google will encourage with Android P. Because manufacturers design their own hardware, many use distinctive looks to set their devices apart from others. Android P may also support designs with foldable or flexible screens. As with the notch, Android phones with these designs have already been seen, and we expect to see more in 2018 even without official support from Google; instead, custom Android interfaces will be put to use, much like on the ZTE Axon M.

Google Assistant

Google Assistant is already a key part of Android. However, Google has made it clear over the past few months that Assistant, and voice control in general, is only just getting started. Google may make Assistant even more prominent in Android P, according to a Bloomberg report, which says engineers are considering placing Assistant inside the Google search bar on the main home screen.

Additionally, developers may be able to use Assistant’s voice technology inside their apps. It’s not clear whether this will be widely available, or whether Google will choose specific partners to make use of Assistant’s talents. The report also says neither of these two features is final, and Google may not include them in the final version of Android P.

Other new features

If Google concentrates on a new look for Android P, will the rest of the software remain similar to Android Oreo? It’s not clear yet, but more improvements to device battery life should be expected.

When will it be announced?

Google often gives us a quick look at the next version of Android during its Google I/O developer conference. We hope 2018 will be no different. Google I/O 2018 is scheduled to begin on May 8, and we’re most likely to hear about Android P during the keynote held on the first day.

When the software will be ready for your phone is a very different story. Developer previews follow the announcement and continue until a final release around September or October, traditionally. The software usually debuts on new Google-produced smartphones in the Pixel range. If you own a phone from a different manufacturer, it will come at a later date.



12 Feb

The $249 Asus Chromebook Flip can download and play apps via Google Play


Time for something new in your life.

The Asus 10.1-inch Chromebook Flip is down to $249 on Amazon. It normally sells for around $270 and has recently been selling for as much as $300. This drop matches its Black Friday price, which it has hit a couple of times since, and it’s the lowest price we’ve seen, making this a great deal.


The Flip features a flexible 360-degree hinge and a touchscreen, allowing it to be used in tablet, stand, or laptop mode. Housed in a sleek aluminum body, it weighs just two pounds and is only 0.6 inches thick. Its battery can last for up to nine hours on a single charge.

It has 4GB of RAM and 16GB of storage, though there’s also a microSD card slot that lets you beef up how much storage it has. You can pick up one of these 64GB memory cards for only $23 with the money you save. You’ll definitely want one, considering this computer has access to the Google Play Store, allowing you to easily download tons of apps and games straight to the device as if it were an Android smartphone.

See on Amazon

12 Feb

How to take a screenshot on a Chromebook


Here’s how to take a screenshot on your Chromebook!

Sometimes, you just need to show someone else the thing that’s on your screen. In this case, taking a screenshot is the easiest thing to do. Here’s how to take a screenshot on a Chromebook!

Use the keyboard

The window switch key on a Chromebook.

Every Chromebook has a keyboard, and taking a screenshot with the keyboard can be done in a couple ways.

  • To capture your entire screen, hit Ctrl + window switch key.
  • To capture only part of the screen, hit Ctrl + Shift + window switch key, then click and drag your cursor to select the area you’d like to capture.

Note: If you’re using an external keyboard, the combinations will be Ctrl + F5 and Ctrl + Shift + F5, respectively.

Use the side buttons


If you’re using your Chromebook in tablet mode, it’s pretty inconvenient to swing the keyboard back around just to take a screenshot. Fortunately, there’s an easier way: just like on an Android phone, you can press power + volume down to take a screenshot.

This only works to capture your entire screen, so if you need a cropped section, you’ll still need to use the keyboard. Keep in mind that the volume buttons don’t rotate with the orientation of your screen, so you’ll need to remember which one is volume down.

Use an extension


While the built-in methods above will work for most people, some users may need a bit more power. In this case, an extension like Nimbus Screenshot & Screen Video Recorder may be worth checking out. As the name implies, Nimbus lets you do a simple screenshot, but also includes the ability to record a video of your screen or delay the capture if you need to screenshot a particular menu.

Download: Nimbus Screenshot & Screen Video Recorder in the Chrome Web Store (free)

Be sure to back it up

By default, your screenshots are saved locally to the Downloads folder on your Chromebook.

From there, you can open a screenshot to crop it, add a filter, and access other basic editing options. If you want to back a screenshot up permanently, you can upload it to Google Photos or back it up to Google Drive.


12 Feb

Verizon will begin locking phones to its network this spring


You’ll soon have to wait a hot minute before taking your Verizon phone to another carrier.

In the United States, almost all carriers require you to wait a certain amount of time after buying a phone before you can use it on another network. The practice is called “locking,” and it’s already instituted by AT&T, Sprint, and T-Mobile. Beginning at some point this spring, Verizon will follow suit.


With Verizon’s current policy of selling unlocked phones, you can purchase any device from the carrier and instantly start using it on another network that supports it. It makes it easy for consumers to take their devices to other providers as they see fit, but according to Verizon, it also makes it easier for crooks to steal these devices and sell them on the black market.

Verizon’s following the industry norm.

Once Verizon changes this, you’ll need to wait an undisclosed amount of time after buying a phone before taking it somewhere else. The carrier hasn’t said how long this waiting period will be, but if it’s similar to what the other carriers already impose, we’re likely looking at anywhere from 15 to 60 days.

I don’t buy into the idea that something like this helps cut down on theft, and instead feel that Verizon is making this change as a way to deter customers from buying its phones and using them on a competitor’s network. It can certainly be seen as a hostile act toward consumers, but from a business point of view, it makes sense for Verizon. The rest of the industry already does this, and if it can keep more people on its network in the process, it might as well.

There’s no firm timeframe for when these changes will be made aside from “this spring,” so if you were planning on buying a Verizon phone and using it elsewhere, I’d suggest doing so soon, before the policy goes into effect.


12 Feb

Amazon said to be making its own AI chip for faster Alexa performance


A dedicated chip would allow Alexa to perform speech recognition without contacting the almighty cloud.

Amazon Alexa is undoubtedly one of the leaders in the virtual assistant space, and while I’ve never found it to be slow on devices like the Echo, a new report suggests that it may get even faster thanks to a new AI chipset.


According to The Information, Amazon is in the process of building a dedicated hardware chip to be used with products like the Echo, Echo Show, and other Alexa-powered speakers so that they can natively process speech recognition without having to first contact the cloud. While Alexa would still need to use the cloud to pull in information and connect with skills, the chip would allow for faster responses to simple commands, such as checking the time or setting an alarm.
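To make the division of labor concrete, here’s a tiny, purely illustrative sketch of that hybrid approach: an on-device recognizer (standing in for the rumored chip) answers a short list of simple intents immediately, while everything else still makes the round trip to the cloud. Every name and behavior below is an assumption for illustration, not Amazon’s actual design.

```python
# Illustrative only: simple intents are answered on the device, the rest go to the cloud.
import time

LOCAL_INTENTS = {"get_time", "set_alarm"}

def on_device_recognize(utterance: str) -> str:
    """Stand-in for speech recognition running on a dedicated local chip (assumed)."""
    if "time" in utterance:
        return "get_time"
    if "alarm" in utterance:
        return "set_alarm"
    return "unknown"

def handle_locally(intent: str) -> str:
    """Answer simple commands without any network round trip."""
    if intent == "get_time":
        return time.strftime("It's %H:%M.")
    return "Alarm set."

def ask_cloud(utterance: str) -> str:
    """Stand-in for the existing cloud pipeline (skills, lookups, and so on)."""
    return f"(cloud answer for: {utterance!r})"

def handle(utterance: str) -> str:
    intent = on_device_recognize(utterance)
    return handle_locally(intent) if intent in LOCAL_INTENTS else ask_cloud(utterance)

print(handle("what time is it"))          # answered locally, no cloud delay
print(handle("play something relaxing"))  # still needs the cloud
```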

This would be a big step for Alexa, but the news is hardly surprising. Both Google and Apple already use AI chipsets for a variety of their products and services, and it was only a matter of time before Amazon followed suit – especially since the company purchased chip designer Annapurna Labs in 2015.

We don’t have a timeframe for when Amazon will start integrating its AI chip into Echo hardware, but this is certainly something we’ll be keeping an eye out for in the future. Stay tuned.



12 Feb

AI facial analysis demonstrates both racial and gender bias


Researchers from MIT and Stanford University found that three different facial analysis programs demonstrate both gender and skin color biases. The full paper will be presented at the Conference on Fairness, Accountability, and Transparency later this month.

Specifically, the team looked at the accuracy rates of facial recognition as broken down by gender and race. “Researchers at a major U.S. technology company claimed an accuracy rate of more than 97 percent for a face-recognition system they’d designed. But the data set used to assess its performance was more than 77 percent male and more than 83 percent white.” This narrow test base results in a higher error rate for anyone who isn’t white or male.

In order to test these systems, MIT researcher Joy Buolamwini collected over 1,200 images that contained a greater proportion of women and people of color and coded skin color based on the Fitzpatrick scale of skin tones, in consultation with a dermatologic surgeon. After this, Buolamwini tested the facial recognition systems with her new data set.

The results were stark in terms of gender classification. “For darker-skinned women . . . the error rates were 20.8 percent, 34.5 percent, and 34.7 percent,” the release says. “But with two of the systems, the error rates for the darkest-skinned women in the data set . . . were worse still: 46.5 percent and 46.8 percent. Essentially, for those women, the system might as well have been guessing gender at random.”
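The kind of audit behind those numbers is simple to describe: rather than reporting one aggregate accuracy figure, you compute the error rate separately for each subgroup. The toy sketch below, with made-up predictions, shows the general idea; it is not the researchers’ code or data.

```python
# Toy example of a per-subgroup error-rate audit; the records are invented.
from collections import defaultdict

# Each record: (skin tone group, true gender, predicted gender)
records = [
    ("darker",  "female", "male"),
    ("darker",  "female", "female"),
    ("darker",  "male",   "male"),
    ("lighter", "female", "female"),
    ("lighter", "male",   "male"),
    ("darker",  "female", "male"),
]

totals, errors = defaultdict(int), defaultdict(int)
for tone, true_gender, predicted in records:
    key = (tone, true_gender)
    totals[key] += 1
    errors[key] += int(predicted != true_gender)

for (tone, gender) in sorted(totals):
    rate = 100.0 * errors[(tone, gender)] / totals[(tone, gender)]
    print(f"{tone:>7} {gender:>6}: {rate:5.1f}% error "
          f"({errors[(tone, gender)]}/{totals[(tone, gender)]})")
```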

There have certainly been accusations of bias in tech algorithms before, and it’s well known that facial recognition systems often do not work as well on darker skin tones. Even with that knowledge, these figures are staggering, and it’s important that companies working on this kind of software take into account the breadth of diversity that exists in their user base, rather than limiting themselves to the white men who often dominate their workforces.

Source: MIT

12 Feb

Trump’s new budget won’t land us on the moon anytime soon


Vice President Pence may have vowed that the US would return astronauts to the moon, but it looks like we aren’t getting there anytime soon. Ars Technica appears to have received an early copy of the White House’s fiscal year 2019 budget set for release at 1 PM ET today. The “Lunar Exploration Campaign” outlined in the document only provides for incremental steps to return astronauts to the moon’s surface, rather than a renewed focus on exploration.

The administration’s focus on the moon, rather than Mars (or some other, more scientifically relevant goal), was met with some grumbling in the space community. Now it appears that it isn’t even a top priority; Ars Technica reports that NASA will work on unmanned landers to the lunar surface between now and 2023. Only after that would NASA begin work on a lander that could return humans to the surface, and the document has no projected date for when that would happen (beyond noting it would be well after 2024). Until then, our lunar presence will be orbital only. Quartz reports that the first stage of the orbital moon outpost will launch by 2020.

The document basically supports NASA’s muddled status quo when it comes to human exploration of space, without providing much direction or the aggressive funding that the administration’s rhetoric seemed to promise. The Orion program and SLS (the mega rocket that will dwarf even SpaceX’s Falcon Heavy) will also continue to receive funding at an increased rate.

As expected, the document also ends funding to the International Space Station by 2025. According to Quartz, the budget allows $150 million in funding to ensure that private companies are able to either take over the space station or create their own habitats in low Earth orbit. The problem here is the small amount of time private companies would have to prepare to do that. It’s unclear at this point whether it would be enough, or whether private companies even hold sufficient interest in the ISS to support it.

Source: Ars Technica, Quartz

12 Feb

The ‘Spire’ portable recording studio is all about spontaneity


Inspiration hits you at weird times — usually when you’re in the shower or about to fall asleep. Basically, moments when you have very little chance of putting it down on paper or recording it. The $350 portable Spire Studio from iZotope probably won’t help much in those situations (unless you keep it next to your bed with a guitar or keyboard). But for folks who write music, it’s a surprisingly simple-to-use eight-track recorder that hooks up to almost all of your gear.

While computers have democratized music recording, they can be cumbersome when you just want to get an idea down. You have to deal with cable adaptors, varying sound levels and complex software. The Spire speeds things up by shoving the necessary audio ports and hardware into a… well, a stubby “spire” and connecting it to a robust but easy-to-use companion app (over WiFi).

It all starts with industry-standard audio inputs. Connecting my synth, guitar, bass, or drum machine to the Spire via one of its two XLR/TRS combo ports was as easy as plugging into a soundboard. I didn’t have to hunt down dongles or adaptors like I would with my computer. The device’s four-hour battery life also meant I didn’t have to dig up an extension cord to record in my living room.

To keep audio levels from being all over the map, the Spire has a “Sound Check” button on the device and in the app. Press the button and play an instrument (or sing into a mic) and the Spire will find a level that’s appropriate for that instrument. My tests with different guitars, bass, keyboards and synths all produced solid audio levels adequate for mixing.
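The general technique behind a feature like Sound Check is easy to sketch: measure the level of a short test take, then compute the gain needed to bring it to a sensible target with headroom left for mixing. The Python below is an assumption about that general approach, not iZotope’s algorithm; the target level and function names are made up.

```python
# Illustrative "sound check": measure a test take's peak and suggest a gain.
import math

TARGET_PEAK_DBFS = -12.0  # assumed target, leaving headroom for mixing

def peak_dbfs(samples):
    """Peak level of a block of float samples (range -1.0..1.0) in dBFS."""
    peak = max(abs(s) for s in samples)
    return float("-inf") if peak == 0 else 20.0 * math.log10(peak)

def sound_check_gain(samples):
    """Gain in dB that would bring the measured peak up (or down) to the target."""
    return TARGET_PEAK_DBFS - peak_dbfs(samples)

# One second of a quiet 220 Hz test tone at 44.1 kHz, standing in for a test take.
test_take = [0.02 * math.sin(2 * math.pi * 220 * n / 44100) for n in range(44100)]
print(f"measured peak:  {peak_dbfs(test_take):.1f} dBFS")
print(f"suggested gain: {sound_check_gain(test_take):+.1f} dB")
```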

Playing, recording, and mixing those normalized tracks only requires a few taps in the companion app. Each track is color-coded, so determining which audio is which is a cinch. That said, a quick naming convention would be helpful, especially if you’re dealing with more than one track of the same instrument.

Before recording each track, you can add virtual amps and “spaces” (acoustic re-creations of rooms) for deeper and more robust sounds. This is old hat for iZotope, which has a long history of audio manipulation via software. Each of these effects can be fine-tuned to your liking. The amps are primarily for guitar and bass, but can be applied to any instrument attached to the Spire.

If you happen to connect two instruments at once, the app automatically produces two new tracks to record on. This makes singing while playing guitar or playing with another band member as easy as playing a single instrument on your own. There’s also a built-in mic that’ll work in a pinch for singing or acoustic guitar. The quality is impressive, but not nearly as nice as plugging in the instrument or using a high-end microphone.

When you’re done, “mixing” is just a matter of moving your tracks around in a virtual space: higher in the space for louder, and left or right for stereo placement. It’s easy to adjust all your tracks on the fly and hear in real time how the changes affect the finished song. As with recording, iZotope took something that can be a huge pain and made it visual and easy to grasp. I just wish some of the app’s slick design translated over to the actual hardware.
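That spatial metaphor maps neatly onto the two numbers every mixer needs per track: a level and a pan position. The sketch below shows one plausible mapping from a track’s spot in the space to gain and pan; the ranges and formula are assumptions for illustration, not iZotope’s implementation.

```python
# Illustrative mapping from a track's position in a virtual space to pan and gain.

def position_to_mix(x: float, y: float):
    """x, y in [0, 1]: x=0 is hard left, x=1 hard right; y=0 is quiet, y=1 is loud."""
    pan = 2.0 * x - 1.0          # -1.0 (left) .. +1.0 (right)
    gain_db = -30.0 + 30.0 * y   # assumed range: -30 dB at the bottom, 0 dB at the top
    return pan, gain_db

tracks = {"vocals": (0.5, 0.9), "guitar": (0.2, 0.6), "shaker": (0.8, 0.3)}
for name, (x, y) in tracks.items():
    pan, gain = position_to_mix(x, y)
    print(f"{name:>6}: pan {pan:+.2f}, gain {gain:+.1f} dB")
```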

My biggest beef with the Spire is its circular form. It doesn’t come with a carrying case and the stubby shape makes it difficult to find a case it’ll fit in. A rectangle or square might not look as cool, but it would be a design that’s easier to shove into a gig bag. The lights on the top are nice visual indicators of what’s happening, but it really doesn’t need to be a circle for that system to work.

Yet, even with its odd shape and slightly high price point, the Spire is an outstanding piece of equipment for musicians. If you’re someone who misses the spontaneity of the old tape-based 4- and 8-track recorders, and you have a bag it’ll fit into, the Spire is worth checking out.

12 Feb

Amazon is reportedly designing AI chips to improve Alexa


Amazon has begun designing its own AI chips, according to an exclusive report from The Information. The hardware is designed for anything powered by Alexa, including the Echo, and would allow the virtual assistant to respond more quickly by adding speech recognition directly to the device.

Right now, whenever a user makes an inquiry on an Alexa-powered device, there is a delay while the virtual assistant contacts the cloud in order to interpret the request. While Echo devices would continue to rely on the cloud for complex inquiries, adding speech recognition directly would allow Alexa to perform simple tasks, such as checking the time, without that cloud delay.

Amazon acquired chip designer Annapurna Labs back in 2015 and has slowly begun churning out its own processors. It was only a matter of time before it started designing and producing chips specifically for its own hardware needs. The company has also begun hiring chip engineers for Amazon Web Services, signaling that it may be moving to its own proprietary chips for those data centers as well.

It should be noted that Google and Apple have both designed their own AI chips, and Google is also using its chips to support services such as Street View, Photos, Search, and Translate. Amazon is just the latest company to go down this route, though the fact that it is reportedly designing these chips doesn’t mean it will get the performance it wants out of them.

Source: The Information