15 Aug

Google adds 30 languages and ‘voice-to-emoji’ transcription to Voice Typing


Why it matters to you

If Google’s voice typing didn’t support your language before, chances are it does now. The latest update covers more than a billion people.

More than 500 million people around the world use Google to translate words and phrases, but the Mountain View company is nothing if not ambitious. On Monday, it announced the addition of 30 languages and locales to its suite of web- and phone-based speech recognition apps, bringing the total number of supported languages to 119.

Google says the expanded support covers more than one billion speakers.

The additions include Georgian, African languages such as Swahili and Amharic, and Indian languages like Urdu, Tamil, Malayalam, and Gujarati. Google said it worked with native speakers to collect speech samples, training its machine learning models to understand the languages’ nuances.

“[We asked] them to read common phrases,” Daan van Esch, a technical program manager at Google’s Speech division, said in a blog post. “This process trained our machine learning models to understand the sounds and words of the new languages and to improve their accuracy when exposed to more examples over time.”

The languages are available across Google’s range of voice-enabled apps, including Translate and Gboard. In the case of the latter keyboard app, though, users have to manually enable each language via the Voice Typing and Voice Search settings.

Here’s how:

  • Install Gboard from the Play Store. Head over to Settings > Languages and Input > Virtual Keyboard > Gboard. You can also access these settings from the keyboard by pressing the G icon and selecting the Settings gear.
  • Tap on Voice Typing > Languages to add more languages.
  • To voice type in Gboard, tap the mic icon on the upper right when the keyboard is open.
  • Next, open the Google app.
  • Tap the three lines on the top left and go to Settings. Tap Voice > Languages and add the ones you want.

New languages aren’t the only thing heading to Google’s speech recognition tools. Starting Monday, Gboard is gaining support for voice-to-emoji transcription — you can say something like “winky face emoji,” and it’ll insert the relevant emoji for you. Google is also upgrading the Cloud Speech API, its voice transcription developer toolkit, with expanded support for audio timestamps and files up to three hours in length.
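For a sense of what that timestamp support looks like on the developer side, here’s a minimal sketch using the Cloud Speech API’s Python client. The Cloud Storage URI, sample rate, and language code are placeholder assumptions rather than values from Google’s announcement, and the client library shown is the current one rather than the version available at the time.

```python
# Minimal sketch: transcribe a long audio file with word-level
# timestamps via the Cloud Speech API (Python client).
# The gs:// URI, sample rate, and language code are placeholders.
from google.cloud import speech

client = speech.SpeechClient()

config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code="en-US",           # any supported locale works here
    enable_word_time_offsets=True,   # request per-word timestamps
)
audio = speech.RecognitionAudio(uri="gs://my-bucket/long-interview.wav")

# Lengthy files go through the asynchronous endpoint.
operation = client.long_running_recognize(config=config, audio=audio)
response = operation.result(timeout=600)

for result in response.results:
    for word in result.alternatives[0].words:
        print(f"{word.word}: {word.start_time} to {word.end_time}")
```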

“[The] new expanded language support […] enables users in more countries to use speech to access products and services that up until now have never been available to them,” Dan Aharon, product manager at Google’s Cloud Platform, said in a blog post.




15 Aug

Quiet, please! Learn how to turn off the camera sound on an iPhone


They say the best camera is the one you have with you, and for most people, that camera is the one on their smartphone. iPhone photography keeps getting better every year, and the iPhone 7 and 7 Plus are no exception. The iPhone 7 Plus has one of the most versatile dual-camera setups around, combining a wide-angle lens with a telephoto lens you can use for portraits. This combination gives you a lot more room to compose your pictures just the way you want them.

If we’re using our smartphones to take pictures everywhere, having them make loud noises isn’t ideal in every situation; sometimes a loud shutter sound simply isn’t appropriate. In this article, we’ll show you how to turn off the camera sound on an iPhone quickly and easily, so you can silence your camera and take pictures without disturbing anyone.

Use the mute switch

The easiest way to mute the camera sound is to use the ring/silent switch on the upper left-hand side of the phone. When you flip the switch to silent, you will feel the iPhone vibrate, and the switch will show an orange stripe. Be aware that this mutes all notifications, incoming calls, and other alerts.

Turn down the volume

What do you do if the silent switch is broken or isn’t working for some reason? Turning down the volume may seem like the obvious alternative, but if you do it while in the camera app, you will start taking pictures in burst mode instead.

If you know you are going to use the camera app and want to mute the sounds, you can turn down the volume before you open the app using the volume-down button on the left-hand side of your iPhone, below the silent switch.

You can also turn down the volume while in the camera app by swiping up from the bottom to reveal Control Center, then swiping left to reveal the volume control.

Country restrictions

Did you know that in some countries the camera shutter sound is effectively mandatory at all times? It isn’t the law, but wireless carriers have imposed the requirement and phone manufacturers have followed suit. In countries like Japan and South Korea, smartphones must make a sound whenever the camera app takes a picture. If you’re in one of these countries, we are sorry to say that even if you mute the iPhone, your camera app will still make a sound.




15 Aug

How would Mozart play ‘Hotline Bling?’ AI will soon help us find out


Why it matters to you

Ever wanted to know what an earlier artist’s take on a modern hit would sound like? Artificial intelligence may soon help us find out.

Remember wannabe pop star Rebecca Black’s much-maligned song Friday from a few years back? As poor as the song itself was, it did give us one brilliant spinoff: The enterprising work of YouTuber HeyMikeBauer, who performed a cover of the song in the style of legendary folk singer Bob Dylan.

If you liked that (and, based on its YouTube views, a whole lot of you did), a new artificial intelligence project may be exactly what the doctor ordered. What researchers at the U.K.’s Birmingham City University are working on is a neural network project they hope will one day predict how a piece of music might have sounded had it been created by an earlier artist — and then generate it for you. Looking for a Pink Floyd cover of Jay-Z? How about a Beethoven symphony re-creating (or, well, pre-creating) The Beatles’ seminal Sgt. Pepper’s Lonely Hearts Club Band? You’ve come to the right place!

“The idea is that we could train a neural network with the work of a musician,” Islah Ali-MacLachlan, senior lecturer in sound engineering, told Digital Trends. “We would use a range of tracks as an input, and the network would automatically detect the start and end of each individual note, the harmonic content, and other important classification data. Based on this we would then input your playing — perhaps a melody or guitar solo — and the system would change your audio. Imagine the phone apps that turn your photo into a Monet or Van Gogh — this would do the same for recordings.”

Ali-MacLachlan says that the project is still in its early stages, with the focus right now being on traditional Irish flute music. “It is difficult for a computer to determine when a note changes when there may not be a pronounced attack like a plectrum hitting a string or a stick hitting a drum head, but we have a system that can deliver 90 percent accuracy in some contexts,” he said. “We have also developed some techniques for classifying timbre and looking at key differences between players. At present, we are working on being able to automatically define different notes to train the neural networks and from there we will start to look at how we can influence the outputs.”
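To get a feel for the note-boundary detection Ali-MacLachlan describes, here’s a minimal sketch using the open-source librosa library. This is a generic stand-in for illustration, not the Birmingham City University system, and the file name is a placeholder.

```python
# Minimal onset-detection sketch using librosa, an open-source
# stand-in for the note-segmentation step described above.
import librosa

# Path is a placeholder; any mono recording works.
y, sr = librosa.load("flute_phrase.wav", sr=None, mono=True)

# Onset strength rises where the spectrum changes abruptly; soft
# flute notes may lack the pronounced attack of a plucked string,
# which is exactly why this step is hard.
onset_frames = librosa.onset.onset_detect(y=y, sr=sr, backtrack=True)
onset_times = librosa.frames_to_time(onset_frames, sr=sr)

print(f"Detected {len(onset_times)} note onsets (seconds):")
print(onset_times)
```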

The overall goal is enormously ambitious, but, hey, wouldn’t we have said the same thing about self-driving cars or computers that can beat humans at Go just a few years back? With AI increasingly capable of learning to impersonate voices based on training data, this may be closer than we think.




15 Aug

O6 smart Bluetooth remote review


Picture this: You’re lounging at home, streaming tunes from your smartphone to a Bluetooth speaker. You start to nod off, but there’s a problem — your phone’s sitting on a table across the room. Fingertips Labs thinks it has the solution: A remote control for iPhones and iPads. Here’s our O6 smart Bluetooth remote review.

The O6 isn’t anything like your TV or stereo system’s remote. The lightweight, all-metal puck consists of a textured center button, an outer ring button, and a rotating dial; the buttons support single-click, double-click, triple-click, and press-and-hold gestures. A rechargeable battery supplies the O6 with up to 10 days of power, and embedded magnets on the back let it stick to metal surfaces like refrigerators and car dashboards.

The O6’s real magic is its channels.

It isn’t meant to replace voice controls, hand gestures, or touchscreens, but to keep you focused on tasks at hand.

“The world has changed drastically in the last 10 years,” the O6 team wrote in a statement provided to Digital Trends. “With ubiquitous Wi-Fi and robust cellular plans, we have gone from ‘work phones’ and desktop computers to being always connected. We can’t stop people from feeling the need to check their mail and social media, but the O6 […] allows them to do it without taking their eyes off the [sidewalk or] road.”

Whether or not it achieves that goal is up for debate, but the O6 worked as advertised in our testing — though how useful it is will depend on which apps you use most often.

Channel surfing

We were impressed with the simplicity of the O6 remote’s setup process. Once you install the O6 companion app on your iPhone or iPad, pair it via Bluetooth, and grant it access to your device’s notifications and contacts list, you’re ready to start controlling apps with dials and buttons.

The O6 ships with a couple of controls pre-configured. By default, a single click of the center button plays/pauses music and selects items in the O6’s companion app. A double-click brings up programmable actions in the O6 app (more on these later), and a single-click of the ring button takes you back to the O6 app’s main menu.

The O6’s real magic, though, is its channels, or native app integrations. A channel for read-it-later service Pocket reads aloud articles you’ve saved with the Pocket browser extension, and a channel for National Public Radio‘s One app lets you like, skip, and rewind podcasts with the O6 remote’s dial and center button.

We tried activating Twitter, one of the suggested channels, first. We logged into a Twitter account via the O6 app, and our iPhone 6S Plus started reading the newest tweets in our timeline. Rotating the O6’s dial clockwise skipped to the next tweet, and rotating it counter-clockwise went to a previous one.

The second channel we took for a spin, Gmail, worked just as well. We scrolled through unread emails in our inbox using the O6’s rotary dial, and selected the Reply button with a combination of single- and double-taps of the center ring. You’ll have to pick up your phone to actually reply, though, or use O6’s smart replies.

But by far the most useful channel is the notifications channel, which reads aloud alerts from apps like Messages, WhatsApp, Facebook, and Facebook Messenger. Incoming alerts populate the notifications channel automatically, where they can be dismissed with a tap of the O6’s center button.

The notifications channel tends to fill up quickly when you have lots of apps installed, and that’s where the O6’s haptic feedback comes in. The remote’s vibrating motors can play more than 200 different haptic effects, which you can assign however you choose. You can map a triple buzz to iMessage notifications, for example, or a long single vibration to Facebook Messenger chats.

You can also use the O6’s haptics to tell the time, though we don’t recommend it. A Morse code-like series of long and short buzzes indicates the time, but it’s not a particularly intuitive system — one long haptic buzz indicates the number five, and one short buzz indicates the number one.
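As an illustration only (Fingertips Labs hasn’t published the encoding, so this is our reading of the description above), the scheme seems to work like a simple tally:

```python
# Hypothetical sketch of the O6's tally-style time encoding, based
# purely on the description above: a long buzz counts as five, a
# short buzz counts as one.
def encode_buzzes(n: int) -> list[str]:
    """Encode a number as a sequence of long and short buzzes."""
    return ["long"] * (n // 5) + ["short"] * (n % 5)

# An hour of 9 would come out as one long buzz plus four short ones:
print(encode_buzzes(9))  # ['long', 'short', 'short', 'short', 'short']
```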

The O6 also supports actions, or shortcuts to in-app settings, buttons, and options. One lets you respond to texts with predetermined responses. Another, an action for phone calls, worked particularly well in our testing — when we got a message containing a phone number, the O6 app extracted it automatically and let us place a call with a click of the center button.

Other apps and settings

As smooth as the O6’s channels and actions were, though, we found ourselves using the remote’s system-level controls more often.

When a call or interactive alert (like an alarm) comes in, the O6 vibrates and switches to Smart Response Mode, which lets you accept or dismiss it by clicking the O6’s center and ring buttons. During an incoming call, for instance, tapping the center button accepts the call, and tapping the ring button declines it.

When the O6 isn’t launching channels or dismissing notifications, it acts like a Bluetooth volume knob, mapping volume to the remote control’s dial and playback to the center and ring button in Spotify, Netflix, YouTube, Pandora, Sonos, Apple Music, and other apps. But there’s a big caveat: You have to start those apps manually, either from the iPhone’s touchscreen or Siri. That seems like a bit of an oversight for what’s ostensibly a remote control.

Using the O6 to control other apps gets complicated. The remote’s Advanced Mode, which lets you perform actions in apps that weren’t designed with the O6’s controls in mind, uses Apple’s VoiceOver accessibility API to work around iOS’s control limitations. In the YouTube app, for example, you can scroll through a list of videos by rotating the remote’s dial, click the center button to select a video (double-clicking the center button plays/pauses it), and use the O6’s ring button to return to the home screen.

In our testing, Advanced Mode tended to be a little unpredictable — it wasn’t always clear which of the remote’s buttons would trigger the desired button/action/option. And switching it on, which requires triple-clicking the iPhone’s home button and launching the O6 app, was a pain.

So just how useful is the O6, really? That depends on which apps you use on a daily basis, and how much you’re willing to compromise on the O6’s hands-free vision. It doesn’t replace a touchscreen — you’ll spend a good deal of time tapping touchscreen shortcuts and pecking out messages with your fingertips. But it’s certainly useful when your phone’s out of reach and you just want to turn down the volume, or when you’re cooking and can’t be bothered to wash your hands. It could be a helpful assistant when driving too.

But the O6 will live and die by its developer support. The remote ships with an open API, but channel support is rather limited right now. As it stands, the O6 is great for acting on notifications, adjusting your phone’s volume, and skipping through articles, tweets, and emails. Sadly, all of that is not worth the $100 asking price.

The O6 comes in orange, blue, and grey. It’s available now, and ships with accessories including a steering wheel mount ($20) and pocket clip mount ($20).




15 Aug

New CRISPR technique could accelerate a cure for Huntington’s disease and ALS


Why it matters to you

While gene editing has historically been a contentious practice, the possibility of curing diseases like Huntington’s disease and ALS helps highlight its benefits.

Just a few weeks ago, we marveled at the first instance of human embryo editing with CRISPR in the United States. Now, the gene editing technique has been used for yet another impressive purpose. Scientists at the University of California, San Diego believe that a modified version of the CRISPR/Cas9 tool could be used to track RNA in live cells using a method known as RNA-targeting Cas9. But more importantly, this methodology could allow doctors to fix the molecular mistakes that result in diseases like myotonic dystrophy types 1 and 2, the most common form of hereditary ALS, and Huntington’s disease.

These types of diseases occur when errors in RNA sequences prevent the production of key proteins. However, with RNA-targeting Cas9, researchers were able to get rid of the RNA errors, particularly those linked to ALS and Huntington’s. In fact, so effective was this new methodology that scientists were able to remove more than 95 percent of the problematic RNA.

“This is exciting because we’re not only targeting the root cause of diseases for which there are no current therapies to delay progression, but we’ve re-engineered the CRISPR-Cas9 system in a way that’s feasible to deliver it to specific tissues via a viral vector,” said senior author Gene Yeo, professor of cellular and molecular medicine at the UC San Diego School of Medicine.

Yeo’s team also found that applying RNA-targeting Cas9 managed to reverse 93 percent of dysfunctional RNA targets in muscle cells, ultimately turning those cells back into what appeared to be healthy control cells. But while these are all promising results, for the time being, the technique has only been tested in lab settings.

“The main thing we don’t know yet is whether or not the viral vectors that deliver RCas9 to cells would elicit an immune response,” Yeo said. “Before this could be tested in humans, we would need to test it in animal models, determine potential toxicities and evaluate long-term exposure.”

Still, the new methodology could be groundbreaking in the field of medicine.

As David Nelles, co-first author of the study, said, “We are really excited about this work because we not only defined a new potential therapeutic mechanism for CRISPR-Cas9, we demonstrated how it could be used to treat an entire class of conditions for which there are no successful treatment options.”




15 Aug

Aukey’s magnetic Bluetooth headphones are down to $17 at Amazon


Our friends at Thrifter are back again, this time with a great deal on a popular set of Bluetooth headphones.

Bluetooth headphones are quickly becoming more popular, but not everyone wants to drop hundreds of dollars on a pair. Luckily, you don’t have to. Right now, you can pick up Aukey’s magnetic Bluetooth headphones for just $16.99 at Amazon when you use the coupon code IZJM7G4Q at checkout. The earbuds are magnetic, so when you have them around your neck they clasp together, and you don’t have to worry about them falling off.


  • Comfortable and secure noise-isolating in-ear headphones that deliver rich, robust sound with punchy bass, featuring aptX technology for purer wireless audio
  • Magnetically clip together for convenient and secure carry around your neck — great for use on the go: walking, commuting, traveling, and more
  • Connect quickly via Bluetooth 4.1 to two devices simultaneously, and effortlessly manage audio playback and calls with the volume controls, multi-function button, and built-in sixth-generation cVc noise-canceling microphone
  • Comfortable ear tips (in three sizes) provide a smooth seal for your ears, isolating you from external noise, while IPX4-certified water resistance ensures sweat and rain on your run don’t mess with your music

These may not compare to Bose or Beats headphones when it comes to the quality of the music coming out of them, but at a tenth of the price, you’ll likely be impressed. Grab a pair today to try out.

See at Amazon

More from Thrifter:

  • How to get the most out of your Amazon Prime membership
  • How to save money when driving

For more great deals be sure to check out our friends at Thrifter now!

15 Aug

No more service fees and wait times: Fix your electronics yourself for $7


Our friends at Thrifter are back again, this time with a deal on an awesome screwdriver set!

Jackyled’s 45-in-1 Precision Screwdriver Set is currently available at Amazon for just $6.92 when you enter promo code ZDAU28AE at checkout, saving you $4 off the item’s regular price.


Featuring tweezers, a handle, an extension bar, and 42 screwdriver bits, this set is a great option for repairing smaller objects like phones, laptops, watches, or eyeglasses. The handle has a non-slip cover to assist your grip while you work, and the bits are magnetic, which helps keep them securely attached.

This tool set has a 4.5 out of 5-star rating on Amazon and is a #1 Best Seller on the site.

Need some repair tips? Check out iFixit, which provides tons of repair tutorials for common tech.

See at Amazon

More from Thrifter

  • Tips for becoming an expert eBay seller
  • 5 free travel apps to help you save big on hotel stays

For more great deals be sure to check out our friends at Thrifter now!

15 Aug

Samsung’s next Gear Fit will let you plumb the briny depths and listen to music without your phone


Samsung’s next Gear Fit fitness band will include standalone GPS and offline Spotify playback.

Like its phones and tablets, Samsung makes different wearables to fit different roles. One of these roles is the traditional watch, filled by the Gear S line. The other is that of a fitness band, filled by the Gear Fit line. The first Gear Fit tracker was announced in early 2014, and Samsung has been steadily improving the line since then.


Most of the time, Samsung announces a new Gear device when it announces a new Galaxy flagship, and it looks like that will be the case again. According to VentureBeat, Samsung will announce the Gear Fit2 Pro alongside the upcoming Galaxy Note 8 on August 23. The wearable is said to be IP68 dust- and water-resistant, an important factor for a fitness-focused device. It will also be submersible to 5 atmospheres of pressure (5 ATM), or approximately 130 feet. (Each atmosphere beyond surface pressure corresponds to roughly 10 meters of water, which is where that 130-foot figure comes from.) So you should have no problem using it at the local pool.


Another great addition is the ability to download and listen to your Spotify playlists without needing your phone, which will be welcome for users who want to leave their phone at home or in the car during a workout. There’s also the standard step and workout tracking, as is to be expected, and the fitness band includes standalone GPS for tracking long runs without a phone. VentureBeat claims this data backs up to the Speedo On application, but that app does not currently exist in the Play Store, and it’s hard to imagine Samsung wouldn’t use its existing health app. Finally, it looks like the clasp will provide a much more secure fit (pun intended).

There’s no word on availability or pricing just yet, but we should find out soon enough. Are you looking forward to the new Samsung Gear Fit2 Pro? Let us know down below!

Learn more about the Samsung Gear Fit2 Pro!

15 Aug

How did Amazon screw up the Echo Show’s best feature so badly?


[Image: the Echo Show displaying news headlines]

Echo Show is good at showing headlines, but very bad at showing anything that’s actually important. And that’s a problem.

It’s Sunday morning. Past the breakfast hour and closing in on lunch. I’ve been trying to come up with ways to make myself feel better after seeing the carnage in Charlottesville, Virginia, and the predictable responses on Twitter and Facebook and from our political leaders. It’s times like these that I don’t want to think at all about tech toys. (And to be clear, this is hardly the first time. Or the second. Or the third. And I’m hardly alone in this feeling.)

But something stood out as I stood in the kitchen making breakfast. And it took me a few hours before I realized what it was.

It was the Amazon Echo Show. Alexa with a screen. I’d chuckled a little earlier in the day reading blogger-turned-investor-turned-blogger M.G. Siegler’s “Quick Thoughts on Amazon’s Echo Show.”

What really sold me was that while I was making coffee, it was next to me displaying news headlines. … This sounds obvious. I mean, we all walk around every single day with devices in our pockets that can access any information — including news headlines — at any time. But there’s something profound about having it pushed to you in an ambient way.

I agree. And once you’re bludgeoned with information the way I was at a newspaper starting from 19 years old — it was my job to try to tame the waterfall — it’s a hard habit to give up. Echo Show is perfect for this. Or, rather, it can be. Eventually.

If it’s not timely, and it’s not important, then why is it being pushed in front of my eyes?

I can say this with certainty: The afterglow of Echo Show headlines will wear off pretty quickly. Maybe it’ll be when you wonder why you’re seeing a headline that’s two hours old (an eternity in online news time). Or maybe it’s when you’ve seen 13 headlines in a row that you just don’t care about. The image at the top of this post, promoting a “Game of Thrones” Episode 5 preview, was showing the day after the episode aired. What good is that?

Or maybe it’s the morning after a domestic terrorist event when you’re walking through the kitchen and don’t see a single headline about it on the Echo Show.

That’s right. Not a word about Charlottesville and the racist Nazis who directly contributed to the death of a woman. (And indirectly to the deaths of two law enforcement officers whose helicopter crashed.)

Not a single headline that I saw in the morning, or in the 10 minutes I left a camera trained on the Echo Show.

Something about the #EchoShow headlines on Sunday morning struck me as odd. … Catch what’s missing? pic.twitter.com/BOf36vsa2d

— Phil Nickinson (@mdrndad) August 14, 2017

As I’m writing most of this piece about 9 hours later, I still don’t see any headlines about Charlottesville. … Fast-forward to Monday morning. … Still nothing. No headlines. No videos. No still images.

Echo Show isn’t exactly a font of information just yet. At least nothing timely. Or of any real import.

Does Amazon worry about showing us anything remotely provocative? Or is it just bad at this?

The question now is why. I don’t think Amazon’s doing anything nefarious here. And I don’t even think it’s about Charlottesville or the current political landscape. I think it’s probably more a matter of not wanting to surface anything too provocative or potentially upsetting. And there’s something to be said for that.

In fact, that’s pretty much what Amazon said when I asked. Here’s a quote from a company representative:

For trending topics on Echo Show, we primarily surface lifestyle, entertainment, and sports news since it’s a communal device that the whole family sees and uses. If customers want to hear business or political news, we offer the daily Flash Briefing which offers a variety of news outlets to choose from. As with everything we do, the Echo Show trending topics experience will continue to improve and evolve over time based on customer feedback.

Fair enough, though I’d still argue the world ain’t always a pretty place, and hiding that isn’t really protecting anyone.

The good news is that this is an easy problem to fix. In lieu of actually improving the headlines feature itself, Amazon could let the user tailor the options. More news, less fluff. More from one source over another. It’s limitless, really.

The problem right now is that the Echo Show headlines are extremely limited. And dated. And that just makes Echo Show — and Amazon — look silly and out of touch.
