
Archive for May 31

A.I. detects skin cancer better than dermatologists in international study


Skin cancer detection won’t be turned over to machines anytime soon, but artificial intelligence detected skin cancer more accurately than a large group of international dermatologists in controlled testing, Agence France Presse reports.

In the study, published in Annals of Oncology, the lead author, Professor Holger A. Haenssle of the University of Heidelberg Department of Dermatology, wrote, “Most dermatologists were outperformed by the CNN. Regardless of any physician’s level of experience, they may benefit from assistance by a CNN’s image classification.”

Man versus machine

The study pitted 58 dermatologists from 17 countries against a deep learning convolutional neural network (CNN).

Prior to the test, researchers from Germany, France, and the U.S. taught the CNN to differentiate benign skin lesions from dangerous melanomas. To do so, the team showed more than 100,000 images of lesions with confirmed diagnoses to the neural network, which was built on Google’s Inception v4 CNN architecture.
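
To make that training setup concrete, here is a minimal, hypothetical sketch of fine-tuning an Inception-style network for binary lesion classification. It is not the study’s actual pipeline: the dataset path and hyperparameters are placeholders, and torchvision’s inception_v3 stands in for Inception v4, which torchvision does not ship.

```python
# Hypothetical sketch: fine-tune an Inception-style CNN to separate
# benign lesions from melanomas. Not the study's code; the dataset path,
# hyperparameters, and the inception_v3 stand-in are illustrative only.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Dermoscopic images resized to the network's expected input size.
preprocess = transforms.Compose([
    transforms.Resize((299, 299)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Assumed folder layout: lesions/train/benign/*.jpg, lesions/train/melanoma/*.jpg
train_set = datasets.ImageFolder("lesions/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.inception_v3(weights="DEFAULT")            # ImageNet-pretrained backbone
model.fc = nn.Linear(model.fc.in_features, 2)             # benign vs. melanoma head
model.AuxLogits.fc = nn.Linear(model.AuxLogits.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:
    optimizer.zero_grad()
    outputs, aux = model(images)                           # aux logits returned in train mode
    loss = criterion(outputs, labels) + 0.4 * criterion(aux, labels)
    loss.backward()
    optimizer.step()
```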

The 58 dermatologists were divided into three self-identified groups: beginners with less than two years of experience, skilled with two to five years, and experts with more than five years of experience. There were 17 beginners, 11 skilled, and 30 experts among the group.

Two tests were run. In the first, the dermatologists were shown 100 dermoscopic images with no other information and were asked to indicate whether each lesion was a malignant melanoma or benign. They were also asked whether they would recommend excision, short-term follow-up, or no action. Four weeks later, the dermatologists were shown the same images again, this time with additional clinical information about the patients plus close-up images.

The results

The CNN scored higher than the overall group of dermatologists on both tests, with and without the extra information. The dermatologists accurately identified an average of 86.5 percent of the skin cancers on the image-only test. In the second test, with more information, the doctors averaged 88.9 percent accuracy. The CNN, however, correctly detected 95 percent of the cancers based on images alone.

Rated by experience group, none of the three groups of dermatologists was as accurate as the neural network. The team did report, however, that 18 of the dermatologists scored higher than the CNN.

“The CNN missed fewer melanomas, meaning it had a higher sensitivity than the dermatologists,” Haenssle said. It also “misdiagnosed fewer benign moles as malignant melanoma … this would result in less unnecessary surgery.”
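
For readers who want those terms pinned down, here is a small worked example with made-up numbers (not the study’s actual confusion matrix): sensitivity is the fraction of true melanomas a classifier flags, and specificity is the fraction of benign moles it correctly leaves alone.

```python
# Illustrative only: a made-up confusion matrix, not the study's data.
true_positives  = 95   # melanomas correctly flagged
false_negatives = 5    # melanomas missed
true_negatives  = 80   # benign moles correctly cleared
false_positives = 20   # benign moles wrongly flagged (unnecessary excisions)

sensitivity = true_positives / (true_positives + false_negatives)   # 0.95
specificity = true_negatives / (true_negatives + false_positives)   # 0.80

print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```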

According to the authors of the study, the test does not mean machines will replace doctors. One issue is that melanomas can be difficult to recognize or image on some parts of the body, such as the toes and scalp. The study calls for larger, repeated clinical trials.

The test does show, however, that dermatologists at all skill levels could benefit from A.I. assistance in skin cancer classification.


May 31

Tummy ache? Swallow this sensor-studded pill to get a diagnosis on your phone


Imagine popping a pill which can then monitor your insides for potential signs of poor health. That is what a team of researchers from the Massachusetts Institute of Technology has been working on — only instead of being the kind of soluble pill your doctor may prescribe, this one is a pill-sized ingestible capsule designed to monitor blood in the gastrointestinal tract. About the size of a pen cap, the prototype sensor combines electronics with useful bacteria. With this fearsome combination, it can detect signs of excessive bleeding in the gut, and then transmit the results to your smartphone.

“We’ve developed a new type of ingestible sensor by packaging living bacterial sensor cells together with readout electronics into a small capsule,” Phillip Nadeau, a former postdoctoral associate at MIT, told Digital Trends. “The cells were genetically engineered to start glowing when they detected heme molecules released during a stomach bleed. This low level of light given off by the cells was detected by the electronics in the capsule, and a signal representing the light level of the cells transmitted outside the body to a user’s cellphone. The advantage of using cells is that they are able to perform detection in harsh environments like the GI tract, and in principle they can be engineered to sense many different types of molecules.”
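
Conceptually, the capsule’s electronics only need to watch the photodetector for a rise in light from the engineered cells and relay a flag to the phone. The sketch below is a speculative illustration of that loop, not MIT’s firmware; the function names, threshold, and sampling interval are all invented for the example.

```python
import random
import time

# Speculative illustration of the capsule's readout loop. Every name and
# threshold here is invented for the sketch; it is not MIT's firmware.

def read_photodetector() -> float:
    # Stand-in for sampling the luminescence sensor (arbitrary units).
    return random.gauss(10.0, 1.0)

def send_to_phone(message: dict) -> None:
    # Stand-in for the capsule's wireless link to the companion app.
    print("to phone:", message)

BASELINE_SAMPLES = 30
ALERT_FACTOR = 3.0   # assumed: flag a bleed when the glow triples over baseline

baseline = sum(read_photodetector() for _ in range(BASELINE_SAMPLES)) / BASELINE_SAMPLES

for _ in range(10):                       # a few polling cycles for the demo
    level = read_photodetector()
    if level > ALERT_FACTOR * baseline:
        send_to_phone({"event": "possible_gi_bleed", "light_level": round(level, 1)})
    time.sleep(1)                         # a real capsule would sample far less often
```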

Lillie Paquette/MIT School of Engineering

Long term, the team envisions the device being swallowed by patients at home to provide a biochemical picture of their gut. Doing so would make it easier to diagnose or manage a range of diseases, including gastric ulcers, inflammatory bowel disease, and colorectal cancer, while also reducing the need for invasive procedures such as colonoscopies and biopsies.

The project is still in the prototype phase, and none of the sensors has actually been ingested by a human. They have, however, been successfully tested in the gut of a pig, and the team hopes that human trials could be a possibility going forward. To get there, they will have to find a way to further miniaturize the sensor — which comprises a microprocessor, button-cell battery, and wireless transmitter — without reducing its capabilities.

“We teamed up with Dr. Giovanni Traverso and his group at MIT to validate the prototype device in a pig model of gastric bleeding, and showed that it worked there,” Mark Mimee, a Ph.D. student at MIT, told us. “In the future, we’re interested in expanding the functionality of the device to other markers of gastrointestinal disease, with a focus on markers of inflammation. Additionally, we’re working on further miniaturizing the electronic components of the device and shrinking the power consumption and battery size to lower the overall footprint, and mitigate the risk of complications.”



May 31

With software ‘magic,’ Insta360 will soon allow 8K playback on smartphones


Insta360

Viewing immersive content hasn’t yet caught up to the increasing resolution of 360 cameras — but 360 camera manufacturer Insta360 has developed a solution that will allow viewers to watch 8K immersive content, even if the only thing they have is a smartphone. On Wednesday, May 30, Insta360 announced CrystalView, a technology that enables 8K playback even on mobile devices that don’t natively support 8K by prioritizing only the portion of the video currently being viewed. The announcement also included FarSight, a new remote system that lets videographers keep the crew out of the shot by operating the camera from up to 1.86 miles away.

CrystalView aims to help bring high-resolution content to headsets and smartphones that only support a 4K resolution. How? The software downsamples the portions of the video that the viewer isn’t looking at. That reduced resolution allows the program to prioritize the view that is actually being watched. 

By sectioning out the video and prioritizing what is actually on the screen at the time, CrystalView makes 8K 360 videos far less taxing on the hardware trying to handle such a large file. An 8K 360 video stretches roughly 8,000 pixels across the entire 360-degree view, which means it’s often the file size, not the screen resolution, that limits playback quality.
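
Insta360 hasn’t published CrystalView’s internals, but the general idea of viewport-adaptive playback can be sketched simply: divide the 360-degree frame into tiles, keep full resolution only for the tiles inside the viewer’s current field of view, and downsample the rest. The tile count, field of view, and downsample factor below are assumptions for illustration, not Insta360’s algorithm.

```python
# Hypothetical sketch of viewport-adaptive playback, the general idea
# behind schemes like CrystalView (not Insta360's actual algorithm).
TILE_COLUMNS = 16          # assumed tile grid over the 360-degree frame
FOV_DEGREES = 100          # assumed horizontal field of view of the viewer

def tiles_in_view(yaw_degrees: float) -> set[int]:
    """Return the tile columns currently visible for a given gaze yaw."""
    degrees_per_tile = 360 / TILE_COLUMNS
    half_fov = FOV_DEGREES / 2
    visible = set()
    for col in range(TILE_COLUMNS):
        center = col * degrees_per_tile + degrees_per_tile / 2
        # angular distance from the gaze direction, wrapped to [-180, 180)
        delta = (center - yaw_degrees + 180) % 360 - 180
        if abs(delta) <= half_fov + degrees_per_tile / 2:
            visible.add(col)
    return visible

def resolution_for(col: int, yaw_degrees: float) -> str:
    # Full 8K detail where the viewer is looking, downsampled elsewhere.
    return "full" if col in tiles_in_view(yaw_degrees) else "quarter"

print([resolution_for(c, yaw_degrees=90) for c in range(TILE_COLUMNS)])
```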

CrystalView’s less demanding format will help when streaming on demand, as well as when reviewing videos stored on the device, Insta360 says. The feature is coming to Insta360 Player this summer, and the company also plans to release a software development kit that will allow other companies to add it to their own playback apps.

Insta360

For 360 creators, one of the challenges of shooting in every direction is keeping the crew out of the footage. Insta360 is working to make that a simpler task with the new FarSight system for the Insta360 Pro. FarSight uses a pair of transmitters and receivers to send the video over a greater range than built-in Wi-Fi allows.

The receiver can display 1080p video, allowing a real-time preview and control of the camera. The system works up to 0.31 miles away ground-to-ground, or 1.86 miles air-to-ground. The system is slated for a summer release, with compatibility with Mac, Windows, iOS, and Android, as well as tablets and routers.

On Wednesday, May 30, Insta360 also announced that an Adobe Premiere Pro extension is coming this week. The previously announced plug-in saves stitching for the last step while still allowing stitched previews, which saves time by giving the computer a smaller file to work with until the edits are complete. Insta360 Stitcher 1.7.0 then exports a fully stitched version at the end. The beta version is coming in the next few days, the company says.


May 31

How to download the data Apple has about you


Europe’s new, stricter data privacy and control rules are finally going into effect. It’s no surprise, then, to see Apple making some updates to its own privacy policies. Some of these updates help Apple meet specific GDPR requirements, including the interesting ability to access all the data that Apple has collected about its users.

Apple’s rollout is likely delayed in some areas for logistical reasons, so the ability to access all your data will be limited by region and may take a while to arrive even in the U.S. If you’ve ever used an Apple service and are located in the U.S. or Europe, though, here’s exactly how to download the data Apple has on you.

Step 1: Visit the Apple privacy website and sign in

Fortunately, Apple has provided a website dedicated to seeing all the collected data for its accounts. Your first step is to visit the Apple Data and Privacy web page.

Here you will see a web form to sign in using your Apple ID and password. If it’s been a while, there’s also a blue “forgot Apple ID or password” link that will let you reset them. If it helps, your Apple ID is typically an email address.

Step 2: Enter an authentication code

If two-factor authentication/verification is enabled for your account (Apple encourages this), then you’ll need to take an extra security step. Apple will shoot you an authentication code, which you need to input on the website to access your account data. Make sure it is correct as you type it in!

Note that if you have another Apple device, the default is to send the code to that device, so you will want to have it handy for this step.

Step 3: View your Apple services

You will now see a screen with a list of all the Apple services that your account is connected to under the heading “Get a copy of your data.”

This includes services like the App Store and iTunes, your Apple devices, the Apple online store, AppleCare, the various aspects of iCloud, and much more. Some of these have options to expand into more specific sections.

To the right, you will see checkboxes that allow you to select each separate service, with an option up top to select all of them. Select all the services you want to receive data from, and then select “Continue.” It may be worth browsing a little at this stage before moving on, so you can note all the services Apple has connected to your account and check that the list looks accurate to you.

This is where you may run into regional trouble, in which case no services will appear. If Apple’s privacy website is particularly busy, you may also get an error message here, in which case you should try to log in again in a few minutes.

Step 4: Choose your data delivery options

Apple will now take you to a page to summarize the apps and services you are requesting data from. It will also ask you to choose a “maximum file size.” This is the maximum size of the files that Apple will provide for you to download.

Apple will divide all your user data into packages of this size and send them consecutively. Choose a size that fits your computer’s speed and storage. It’s smart not to go too large if you have a lot of Apple services, or you’ll get a massive file that may be tough to download and transfer.
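
To see how the maximum file size choice plays out, here is a quick back-of-the-envelope calculation with made-up numbers (Apple doesn’t tell you the total in advance): the archive is split into roughly the total size divided by the maximum size, rounded up.

```python
import math

# Made-up numbers for illustration; the real totals depend on your account.
total_data_gb = 23.0       # hypothetical size of everything you requested
max_file_gb = 5.0          # the "maximum file size" you pick on the form

downloads = math.ceil(total_data_gb / max_file_gb)
print(f"{downloads} archives of up to {max_file_gb} GB each")   # 5 archives
```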

When you are finished, choose “Complete request.”

Step 5: Wait

As you can imagine, Apple is seeing a lot of requests for data. You may have to wait several days or longer for Apple to process your request and send your data to your email. Wait times of up to a week or more are currently possible, so don’t count on a turnaround within a day or two.

When you receive your data files, remember that Apple also provides you with an option to correct your personal information or delete your account, depending on what you find.



May 31

LG V35 ThinQ vs. LG G7 ThinQ: Which phone prevails in stablemate showdown?


The LG G7 ThinQ isn’t even out yet, but LG is already taking the wraps off its next phone, the LG V35 ThinQ. As stablemates, these two phones have a lot in common: powerful hardware, huge screens, and LG’s A.I. smarts. But when it comes down to the real nitty-gritty, which of the two should you spend your hard-earned dollars on? We put them head-to-head in a specs comparison.

Specs

LG V35 ThinQ vs. LG G7 ThinQ

Size: 151.6 x 75.4 x 7.4 mm (5.97 x 2.97 x 0.29 inches) vs. 153.2 x 71.9 x 7.9 mm (6.03 x 2.83 x 0.31 inches)
Weight: 158 grams (5.57 ounces) vs. 162 grams (5.71 ounces)
Screen size: 6-inch OLED vs. 6.1-inch IPS LCD
Screen resolution: 2,880 x 1,440 pixels (537 pixels per inch) vs. 3,120 x 1,440 pixels (564 pixels per inch)
Operating system: Android 8.0 Oreo on both
Storage space: 64GB vs. 64GB or 128GB
MicroSD card slot: Yes, up to 2TB, on both
Tap-to-pay services: Google Pay and LG Pay (in South Korea only) on both
Processor: Qualcomm Snapdragon 845 on both
RAM: 6GB vs. 4GB or 6GB
Camera: Dual 16MP (with OIS) and 16MP wide-angle rear, 8MP front, on both
Video: Up to 4K at 30 frames per second, 720p at 240 fps, on both
Bluetooth version: Bluetooth 5.0 on both
Ports: 3.5mm headphone jack and USB-C on both
Fingerprint sensor: Yes on both
Water resistance: IP68 on both
Battery: 3,300mAh vs. 3,000mAh; both support QuickCharge 3.0 (the G7 supports QuickCharge 4.0 with an adapter that isn’t included) and Qi wireless charging
App marketplace: Google Play Store on both
Network support: AT&T and Project Fi vs. T-Mobile, Verizon, and Sprint
Colors: Aurora Black and Platinum Gray vs. Aurora Black, Platinum Gray, Raspberry Rose, and Moroccan Blue
Price: $900 vs. $750
Buy from: AT&T and Project Fi vs. Verizon, T-Mobile, and Sprint
Review score: Hands-on review vs. 3.5 out of 5 stars

Performance, battery life, and charging

Julian Chokkattu/Digital Trends

You’re not likely to see much of a difference between these two models when it comes to performance, as both use the latest Snapdragon 845 processor. It’s an exceptionally powerful chip, and most phones we’ve tested with it have proven to be lightning fast, with no issues handling even the most demanding games. We haven’t had a chance to fully test the LG V35 ThinQ yet, but we expect it to perform extremely well. The G7 comes with 4GB of RAM in the base model, but the V35 ups that to 6GB.

We were slightly disappointed with the size of the G7’s battery on launch, but the 3,000mAh capacity proved capable enough to last a little more than a day. Again, we haven’t tested the V35 ThinQ properly yet, but we reckon the slightly larger battery size and OLED display should mean it’s a slightly stronger performer in battery life. Charging-wise, you can plunk either of these down on a Qi wireless charger, or connect with a USB Type-C charging cable for superfast QuickCharge 3.0 recharging.

There’s a lot that’s similar between these two phones, but the larger battery capacity and the extra RAM on the V35 ThinQ push it ahead. It wins this round.

Winner: LG V35 ThinQ

Design and durability

Julian Chokkattu/Digital Trends

There’s a lot to like on the LG G7 ThinQ, but design isn’t necessarily high on the list. It’s not a bad style — the notched display and bezel-less design mark it firmly as a 2018 flagship — but it simply doesn’t do enough to differentiate itself in terms of looks. The V35 ThinQ is a little more uniform on the front because there’s no notch, but since it’s a design we’ve already seen on the V30S and the V30, it’s hardly exciting.

Both phones feel very light in the hand, which lends an air of cheapness — not something you want to feel in a premium flagship. We don’t recommend dropping either of these phones, thanks to the large amount of glass used on both. Get a case for either one and your peace of mind will improve. Surprisingly, the G7 should be slightly more durable because it uses Gorilla Glass 5 on the front and back, as opposed to Gorilla Glass 4 on the V35. Both phones fare well in terms of water and dust resistance, thanks to their IP68 ratings. That should mean both of them can take a trip down the toilet or into the bath and come out mostly unscathed (though we don’t recommend testing that).

One other big difference is the power button. The V35 uses the fingerprint sensor as the power button, but the LG G7 adds a new power button on the right edge of the phone.

The LG G7 ThinQ is slightly more durable than the V35, and we think its design is at least a little more interesting. It takes the win.

Winner: LG G7 ThinQ

Display

Julian Chokkattu/Digital Trends

We hope you like big screens, because both of these phones come with huge displays, thanks to bezel-less designs and extended aspect ratios. The G7 ThinQ has the larger screen, a 6.1-inch Super LCD panel with a 3,120 x 1,440 resolution that delivers a super-sharp 564 pixels per inch (ppi). The V35 ThinQ has a smaller 6-inch screen and a slightly lower 537 ppi, but with a resolution of 2,880 x 1,440 pixels, it’s likely tough to notice a difference.
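
Those pixels-per-inch figures follow directly from the resolution and the diagonal screen size; a quick calculation reproduces them.

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    # Pixel density = diagonal pixel count divided by diagonal inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3120, 1440, 6.1)))   # ~563, matching the G7 ThinQ's quoted 564 ppi
print(round(ppi(2880, 1440, 6.0)))   # 537, the V35 ThinQ
```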

There is a difference between the types of screens used. While the G7 ThinQ uses a Super LCD screen, the V35 ThinQ is rocking an OLED screen, with all the advantages that screen tech delivers. The Super LCD on the G7 ThinQ looks great, but it can’t offer inky blacks like an OLED display. The G7 does have a trick up its sleeve: Super Bright Mode. Tap a button on the brightness slider and the screen will hit 1,000 nits — plenty of brightness to see the display outdoors in daylight. It only lasts 3 minutes, but it makes the screen far easier to read than an OLED outdoors.

The notch design on the G7 is a point of contention. You can turn it off via software in the settings, but you won’t have to deal with this problem at all on the V35.

There are advantages and disadvantages to both screens. While we love OLED displays, the Super Bright Mode is certainly handy outdoors, and the sharper screen helps, too. This round is a tie.

Winner: Tie

Camera

We haven’t had much chance to test the LG V35 ThinQ’s camera yet, but it has exactly the same camera we saw in the LG G7 ThinQ. Both phones’ camera suites consist of two 16MP lenses on the back — one standard, one wide-angle. These cameras are quite capable, with the wide-angle delivering some rather spectacular shots on occasion — but it’s when the lights go down that performance really takes a hit. Despite LG’s Super Bright mode, the G7 ThinQ just doesn’t deliver in low-light situations — and given the identical hardware, the V35 ThinQ will probably suffer from the same issue.

The V35 ThinQ will also come with the same A.I.-powered camera smarts that will change your settings based on what’s in frame, 4K video recording, and a 240 fps slow-motion video mode. There’s nothing different here, so this category is a tie.

Winner: Tie

Software and updates

Julian Chokkattu/Digital Trends

It’ll likely come as no surprise that you’ll find much the same version of Android 8.0 Oreo on both of these LG flagships. It’s not stock Android, and both come with a couple of LG’s pre-installed apps, but there are plenty of settings to help you customize the phone to your liking.

In terms of updates, as flagships, we expect these two will get a good few years of updates from LG, and will most likely be updated to Android P. Judging by past performance, you can expect the LG V35 ThinQ to get an update before the G7 ThinQ, but there’s unlikely to be a huge difference between the two. This is a tie.

Winner: Tie

Special features

We’re seeing more and more A.I. integration in the market, and LG’s products are no strangers to A.I. smarts. The “ThinQ” name indicates an entry into LG’s A.I.-assisted world, which means both the LG G7 ThinQ and V35 ThinQ will be able to talk to any of LG’s other ThinQ-branded products. But it doesn’t end there (thankfully), as LG has ramped up its integration of Google Assistant. The G7 ThinQ comes with a dedicated button that can be used to talk to Google Assistant without having to say a hotword — similar to how the Bixby button works on Samsung phones. The V35 ThinQ doesn’t have that button, but like the G7, you can ask Google Assistant to complete specific phone-related tasks.

The BoomBox speaker was one of the more impressive parts of the G7 ThinQ, using the inside of the phone as an echo chamber to enhance sound quality. The V35 ThinQ doesn’t have this specific feature, but it does get a Quad DAC and DTS-X Virtual Surround Sound like the G7.

Winner: LG G7 ThinQ

Price

The LG G7 ThinQ is currently available for pre-order, and will ship June 1, with prices starting from $750. It will be available on all carriers except for AT&T. Pre-orders for the LG V35 ThinQ will start June 1, with the phone shipping June 8. It will cost $900, and it will be sold exclusively on AT&T and Google’s Project Fi network.

Overall winner: LG G7 ThinQ

The LG V35 ThinQ may be the company’s latest phone, but we think the G7 ThinQ is the better buy. It has a slightly flashier design and handy features like the Super Bright Mode, a dedicated Google Assistant button, and the Boombox speaker. But if you hate the notch and you’re on AT&T or Project Fi, the more expensive V35 might be the phone for you: it does have a bigger battery, an OLED screen, and more RAM.



May 31

Schlage locks and Google Home team up to make your smart home safer


If you have a Schlage Sense Smart Deadbolt, your front door just unlocked a new capability.

Schlage announced that its smart locks now work with Google Home, with Google Assistant on Android devices, and with the Google Assistant app on Apple devices. The company first announced the integration at CES 2018 in January, and the feature went live on Tuesday, May 29.

“Schlage has been at the forefront of IoT security solutions since launching the Schlage Sense Smart Deadbolt in 2014,” Rob Martens, a futurist at Allegion, the maker of Schlage locks, said in a statement. “This new integration and functionality is a testament to our ongoing commitment to raise the bar for innovation, convenience, and security and to support consumers’ choice of IoT platform.”

The update enables users to lock their door and check whether it’s locked or unlocked with their voice. Users can activate the capability by saying, “OK Google, is my door locked?” or, “OK Google, lock my door.” However, the upgrade does not include the ability to unlock the deadbolt with your voice.

Using the new functionality requires the Schlage Sense Wi-Fi Adapter, which Schlage sells separately for $69. The adapter also enables users to remotely access their lock using the Schlage Sense app on iOS and Android devices.

The Schlage Sense Smart Deadbolt lets users create up to 30 unique access codes, schedule access codes, and review past activity to see which codes were used when.

The smart locks already work with Amazon’s Alexa and Apple’s Siri. Schlage introduced the ability to lock and check the status of your door via Alexa in October, and added the ability to unlock it by speaking a PIN earlier this month.

For security reasons, the Alexa unlock feature is disabled by default. To enable the functionality, users must log in to their Amazon Alexa app and then create a PIN. After three incorrect voice code attempts, the feature will be disabled.

The Google Home update makes the Schlage Sense Smart Deadbolt more competitive with other smart locks, such as the August Smart Lock and the Kwikset Obsidian, which are also compatible with Google Assistant.

The Schlage Connect and Schlage’s Connected Keypad are not yet compatible with Google Assistant. Consumers now await the next update, wondering whether it will bring compatibility to those locks and add the ability to unlock the door via voice command.



May 31

Qualcomm’s new Snapdragon chip will take VR, AR to the next level


As expected, Qualcomm introduced the world’s first platform dedicated to extended reality (XR) with the launch of its Snapdragon XR1 chip and headset design. As previously reported, XR is an umbrella term used to group virtual reality, augmented reality, and mixed reality hardware and experiences. Companies already lined up to use the platform include Meta, HTC’s Vive brand, Vuzix, and Pico.  

Until now, mobile-based VR and AR applications have relied on chips optimized for smartphones. The takeaway from Qualcomm’s announcement is that, despite the Snapdragon brand, the XR1 chip wasn’t designed for smartphones and tablets. Instead, it’s optimized for extended reality experiences, including augmented reality applications backed by artificial intelligence.

Beyond packing ARM-based CPU cores and a GPU, the new chip includes an A.I. engine for on-device processing. That means the resulting headset won’t require a tethered connection to a PC, nor will it need the cloud to process A.I.-based functions. Given that extended reality experiences powered by A.I. require loads of processing, Qualcomm tuned the chip for high performance and power efficiency to prevent the host device from quickly gobbling up the battery’s charge.

“Other key features include an advanced XR software service layer, machine learning, the Snapdragon XR Software Development Kit (SDK) and Qualcomm Technologies connectivity and security technologies,” Qualcomm says. 

The new platform supports 4K video at 60 frames per second and the latest graphics APIs, such as Vulkan, OpenGL, and OpenCL, and it includes an integrated Spectra image signal processor. It also supports visual-inertial odometry, a technology that lets you interact with augmented reality objects and move freely around the virtual world without cables.

Manufacturers building headsets on the XR1 chip can implement either three or six degrees of freedom of head tracking. Six degrees of freedom means that, in addition to turning your head, you can move up, down, left, right, forward, and backward without the need for a tethered PC or external sensors. Three degrees of freedom is what you typically see on smartphone-based headsets, which track only rotation: rolling (tilting your head from side to side), pitching (looking up and down), and yawing (looking left and right).
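
A simple pose structure makes the three-versus-six distinction concrete: three degrees of freedom track only head orientation, while six add position in space. This is a generic illustration, not the Snapdragon XR SDK’s actual types.

```python
from dataclasses import dataclass

# Generic illustration of 3DoF vs. 6DoF tracking, not Qualcomm's SDK.

@dataclass
class Pose3DoF:
    pitch: float   # look up/down, degrees
    yaw: float     # look left/right, degrees
    roll: float    # tilt head side to side, degrees

@dataclass
class Pose6DoF(Pose3DoF):
    x: float       # step left/right, meters
    y: float       # crouch/stand, meters
    z: float       # walk forward/back, meters

seated = Pose3DoF(pitch=-5.0, yaw=30.0, roll=0.0)                       # phone-style headset
roomscale = Pose6DoF(pitch=0.0, yaw=90.0, roll=0.0, x=0.5, y=0.0, z=1.2)  # untethered 6DoF
print(seated, roomscale, sep="\n")
```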

Qualcomm put a lot of work into the audio side, too. The platform relies on the company’s 3D Audio Suite, Aqstic Audio, and aptX Audio not only for high-quality sound but also for voice assistance that is always on and always listening. Meanwhile, head-related transfer functions give audio a 3D-like quality, pinning sound sources to specific points in virtual and real-world space no matter which way you’re facing.

In a separate announcement, Vuzix said its next-generation Blade smart glasses will be based on the Snapdragon XR1 platform. The combination of Qualcomm’s platform and Vuzix’s “proprietary waveguides and display engines” will make for smaller, more fashionable devices. The Snapdragon XR1 will also be used in the next Vuzix M-Series smart glasses for the enterprise.

Products from Vuzix aren’t expected to appear until sometime in 2019. HTC, Pico, and Meta had not made any official announcements at the time of publication.



May 31

Reddit beats out Facebook to become the third-most-popular site on the web


Christian de Looper/Digital Trends

Reddit has surpassed Facebook to become the third-most-popular internet destination for users in the United States, according to rankings published by Amazon subsidiary Alexa (no, not that Alexa), a service that tracks and analyzes web traffic. Despite its recent, controversial site redesign, Reddit now trails only Google and YouTube, and ranks ahead of Facebook and Amazon.

While Reddit still doesn’t attract the same amount of traffic as Google or YouTube, the good news for the site is that users spend more time browsing it, averaging 15 minutes and 10 seconds every day, The Next Web reported. For comparison, users spend just 7 minutes and 16 seconds on Google, 8 minutes and 31 seconds on YouTube, 10 minutes and 50 seconds on Facebook, and 7 minutes and 37 seconds on Amazon. Reddit also beats the other sites in Alexa’s top five on daily page views per visitor. Rounding out the top 10 are Wikipedia, Yahoo, Twitter, eBay, and Netflix.

Reddit’s rise comes at a tumultuous time for Facebook, which has experienced a number of high-profile scandals over user privacy and data sharing in recent months. Most notably, the Cambridge Analytica scandal, impacting as many as 87 million Facebook users, has resulted in users calling for a boycott of Facebook over the company’s data collection and advertising practices. In response to the scandal, Facebook unveiled changes to make its site safer, and Europe’s General Data Protection Regulation, also known as GDPR, subsequently went into effect to help make the internet a safer place for users.

Reddit itself has also been a subject of controversy. Created by founder Alexis Ohanian, the site merged with Aaron Swartz’s Infogami in 2006. In 2011, Swartz was charged under federal hacking laws for downloading millions of academic articles from a subscription service he accessed through a guest account at MIT. Two years later, Swartz died by suicide in his apartment. More recently, Reddit made headlines when it refused to ban hate speech on its site, arguing that free speech should be protected unless there is a threat of violence or harm. In a separate incident, Reddit was also where stolen personal images of celebrities were posted following an attack on Apple’s iCloud.



May 31

In Boston’s newest restaurant, all the chefs are robots


When you look at how many startups offer well-stocked cafeterias among their employee perks, it’s no great surprise to hear that people working in the tech industry love food. But how many love it enough to launch their own restaurant? At least four: Recent MIT graduates Brady Knight, Michael Farid, Luke Schlueter and Kale Rogers. They recently launched a new fast food restaurant called Spyce. Its hook? The fact that the entire kitchen is staffed by robots.

Now open for business in downtown Boston, Spyce offers a half dozen bowls of food in Latin, Mediterranean, and Asian styles, with prices starting at just $7.50 a bowl. The explanation for the budget-conscious price tag is that using robots to prepare the meals saves on costs. Those savings are then passed directly on to the customer for an experience that’s both high-tech and wallet-friendly.

“While we expected many people to come to the restaurant at first because of the novelty of the robot, the real benefit of our robotic kitchen comes from the quality of meals we are able to serve,” co-founder and lead electrical engineer Brady Knight told Digital Trends. “Being that our robot does the portioning and cooking, we can ensure the meals are being made consistently and accurately. Another advantage is that our technology allows employees to focus on creating more meaningful connections with our guests.”

Spyce

When customers enter Spyce, they are met with a human guide who shows them to a touchscreen kiosk where they can place their order. This order is then sent to the kitchen — which is visible to the customers — where the food is prepared by robots. Finally, it’s handed over to a human employee to add garnishes like cilantro or crumbled goat cheese, before being distributed to the customer.

Don’t worry about a lack of human chef expertise, though. The Spyce robots precisely execute recipes created by Sam Benson of the celebrated Café Boulud. The company also boasts Michelin-starred chef Daniel Boulud on its advisory board; he agreed to participate after seeing a demonstration of the robot in action.

“Running a restaurant is quite difficult,” Knight acknowledged. “It’s an industry of low margins, with high turnover rates, and little room for error. While I can’t speak to the industry as a whole, our technology has allowed us to deliver incredible meals for $7.50 and serve them consistently. We’re excited to be part of the industry and grow with it.”
