
Xiaomi plans to launch its smartphones in the U.S. by the end of 2018


Andy Boxall/Digital Trends

In February, Lei Jun — founder and CEO of Chinese technology company Xiaomi — revealed his plans for further global expansion. According to The Wall Street Journal, Xiaomi plans on entering the U.S. smartphone market as early as the end of 2018.

“We’ve always been considering entering the U.S. market. We plan to start entering the market by end 2018, or by early 2019,” Jun told the publication.

Xiaomi’s hardware is already available in more than 70 countries spread across Asia, Africa, and parts of Europe. At the end of 2017, it placed fourth in China — right behind Vivo, Huawei, and Oppo. Around the same time, Xiaomi also increased its phone shipments globally by 83 percent.

News of Xiaomi entering the U.S. market has been surfacing for a few years now. In 2016, it was rumored the company was testing its phones on U.S. networks, with plans to release the devices within a few years.

If the company does launch its handsets in North America, it is still unclear whether it will sell the smartphones through U.S. carriers or directly to consumers through its website. As of now, you can purchase items other than smartphones on the site, such as the Mi Bluetooth speaker, the Mi Sphere Camera Kit, and more.

But breaking into the U.S. smartphone market hasn’t been easy in the past, especially for Chinese manufacturers. In January, Huawei announced its flagship smartphone would not be sold by U.S. carriers, reportedly due to security concerns.

Carriers such as AT&T and Verizon reportedly cut ties with Huawei under pressure from U.S. lawmakers. According to reports, politicians warned that any connection to Huawei or China Mobile could jeopardize the carriers’ ability to do business with the U.S. government.

The issue stemmed primarily from a 2012 congressional report that advised U.S. carriers to stay away not only from Huawei gear but from ZTE’s as well, on the grounds that “China might use it to spy on Americans.”

It’s still unclear whether Xiaomi will face similar scrutiny when it comes to U.S. carriers. But a U.S. launch would allow the company to grow its presence in the country.

Editors’ Recommendations

  • Xiaomi CEO vows more global expansion and renewed assault on China market
  • U.S. lawmakers reportedly pressure AT&T to completely cut ties with Huawei
  • ZTE and Huawei respond to intelligence agency warnings over security risks
  • Amazon, Google trade punches over Nest smart home product sales
  • Microsoft and Xiaomi team up to produce smart speakers, laptops, and more


6
Mar

ModiFace replaces makeup brushes with neural networks, and it’s coming to the S9


Brenda Stolyar/Digital Trends

Samsung’s Galaxy S9 and S9 Plus come with a variety of new camera features including AR Emoji, Super Slow Motion, and a variable aperture. But there’s one more new trick that we’ve never seen built into a smartphone before – the augmented reality makeup tool in Bixby Vision.

Baked into the S9’s camera app, Makeup lets you apply different products to your face with augmented reality. Point the selfie camera at yourself, and the makeup styles you pick are layered over your face, just like Snapchat filters. The benefit is that you’re trying on makeup in the comfort of your own home, while also avoiding the need to clean up afterwards. It’s the first time we’ve seen this kind of feature built into a smartphone, but the technology is far from new. Samsung tapped ModiFace, a company with more than a decade of research in this field, to integrate its beauty AR technology into the camera.

From skincare to augmented reality makeup

If you’ve ever used the Sephora app to virtually try on makeup, or Benefit’s Brow Try-On app to test out a new eyebrow shape, then you’ve used ModiFace’s AR technology. The company now powers more than 200 custom augmented reality apps for high-end beauty brands, but it all started with skincare.


Before the smartphone revolution, ModiFace worked with dermatologists and certain brands to help people find solutions to skincare concerns. People could upload a photo of themselves to a web app, and the company would process it to pinpoint problem areas and suggest improvements to their skincare routines.

ModiFace originally used 2D still images in the web app but, as the technology progressed, moved on to letting users upload video. When smartphones took off, the company began testing its technology with augmented reality and added makeup and hair to its roster of supported features.

“With the smartphone boom, it was the perfect opportunity for us to expand because it allowed us to bring AR right to our users in the palm of [their] hand,” Jeff Houghton, ModiFace vice president of Technology, told Digital Trends.

Its current software development kits (SDKs) are the culmination of 10 years of engineering. By working closely with beauty brands, ModiFace is able to provide a lightning-fast, easy-to-follow user experience while also encouraging product discoverability.

ModiFace’s technology powers more than 200 augmented reality apps for high-end beauty brands, including numerous smart mirrors that allow customers to virtually try on makeup in real time. Photo credit: ModiFace.

Its technology doesn’t stop at smartphone apps; it extends to retail stores as well. Back in November, MAC Cosmetics debuted its MAC Virtual Try-On Mirror at select locations. The mirror lets customers virtually try on makeup in real time by swiping through the interface, eliminating the need to test a pile of products in the store and narrowing the choice down to the styles you like most.

Putting its tech on the Galaxy S9

The same technology and concept are what’s available on the new Galaxy S9. By collaborating with Samsung, ModiFace was able to optimize the experience.


“Starting in house, we train our Neural Networks on thousands of images to create the base tracking and face analysis for our apps,” Houghton said. “This Neural Network is then embedded inside Samsung’s app. We worked with Samsung to tweak several parameters to make sure we were achieving the effects that brands and end users want to see.”

The partnership between the two companies could kick-start a new trend for phone manufacturers. While beauty technology is still in its infancy, makeup has found its way onto our screens for years through social media. Whether it’s YouTube or Instagram tutorials, people are constantly looking to their smartphones for new makeup inspiration. With the Galaxy S9, all you need to do is swipe open the camera and you have a catalog of makeup at your disposal.

On the Galaxy S9, you can search through a variety of cosmetic products from both Sephora and Cover Girl, and you can purchase them on the spot. As you scroll through each product, it is applied to your face like a Snapchat filter.

Using facial tracking and 3D video rendering, the makeup filters are mapped to the face at 30 frames per second. There’s no lag, and nothing is misaligned. It works as fast as you can tap the next product you want to try on. You can try on complete looks – which include lipstick, foundation, eyeshadow, blush, mascara, and eyeliner – or try each product separately.
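ModiFace and Samsung haven’t published how the S9’s renderer works, but the core idea of mapping a product color onto tracked facial landmarks and blending it into each frame can be sketched in a few lines. This is a minimal illustration under assumed inputs (the landmark points, the BGR shade, and the opacity are all hypothetical), not ModiFace’s actual pipeline, which also models lighting, texture, and finish:

```python
import numpy as np
import cv2  # OpenCV; the face-tracking model that supplies the landmarks is assumed

def apply_lipstick(frame, lip_landmarks, color=(40, 20, 160), opacity=0.4):
    """Blend a lipstick shade over the lip region of a single video frame.

    frame         -- H x W x 3 BGR image from the selfie camera
    lip_landmarks -- N x 2 array of (x, y) points outlining the lips, as a
                     face-tracking model would supply each frame (assumed given)
    color         -- BGR product shade to "try on" (hypothetical value)
    opacity       -- how strongly the product covers the underlying skin
    """
    # Rasterize the lip outline into a mask, then feather its edge so the
    # color fades naturally instead of ending in a hard line.
    mask = np.zeros(frame.shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [np.asarray(lip_landmarks, dtype=np.int32)], 255)
    mask = cv2.GaussianBlur(mask, (15, 15), 0).astype(np.float32) / 255.0

    # Alpha-blend a solid swatch of the product color over the masked region.
    swatch = np.empty_like(frame)
    swatch[:] = color
    alpha = (mask * opacity)[..., None]
    out = frame.astype(np.float32) * (1.0 - alpha) + swatch.astype(np.float32) * alpha
    return out.astype(np.uint8)
```

Run once per preview frame, with the tracking model supplying fresh lip coordinates each time, this is roughly the 30-frames-per-second loop described above, minus all the hard parts that make the result look convincing.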

Trying on foundation can be extremely tricky; it almost always requires going to the store and having an employee help you find your exact shade. There are also different foundation types to take into account – matte, sheer, water-based, or full coverage. To make sure the shades are as accurate as possible, the ModiFace team tediously went through each product and compared how it looked in real life versus how it looked on the smartphone. The team worked with both Samsung and the brands directly to create a matching render.

“I myself was actually involved with that process and it’s kind of a lot of fun,” Houghton said. “You get to try on a lot of different things, and then you get to wipe them off and see how they look in the app. It involves a lot of screenshot comparison and manual work to make sure that the product we’ve included is just perfect.”


But there’s still always room for improvement. Using deep learning and image research, the team studies internal data — gathered from those who are testing the final product — to learn more about a user’s face in order to train its algorithms. This helps produce better results when it comes to factors like how colors mix onto your lips, how light interacts with those colors, and how your face should look under different lighting with a specific foundation on. The data also helps improve the texture and overall coverage of the makeup.

With this kind of deep learning, the company is able to create new and more realistic effects. Recently, ModiFace partnered with L’Oreal to create 3D hair tracking that can recolor your hair in real time. The company has plans to work with Samsung to bring similar effects to its phones.

For those with privacy concerns, ModiFace doesn’t collect or monitor any data from its applications once they’re in users’ hands. Any images you take with the apps never leave your device.


The MacBook Air doesn’t need a price cut, it needs a redesign


A new report from a reliable source has indicated Apple is looking to release a cheaper MacBook Air later this year. Obviously, we don’t know the specifics yet, but it’s an interesting report, especially when compared to the January report that the MacBook Air was going to be replaced later this year.

It could come in the form of a dramatic price cut of the 12-inch laptop, or in a complete redesign of the MacBook Air — either way, it’s time for something new in the entry-level market, not just a price cut. Apple has been riding the wave of the MacBook Air for years, selling it on name recognition alone. According to Apple, it was meant to be discontinued a long time ago, but has stayed in circulation due to how well it sold. An update has been a long time coming.

Regardless of the direction of this new laptop, there are a few things Apple has to do to fix the problem. Here are the most important — a fresh 1080p display, an 8th-gen Intel processor, and some Thunderbolt 3 ports.

.@Apple might have a cheaper update or replacement to the #MacbookAir coming out this year. What feature would be a must-have for you?

RT for a larger sample size!

— Digital Trends (@DigitalTrends) March 5, 2018

According to the results from our poll, people are looking for something new in the MacBook Air, not just a price cut. But the prospect of Apple continuing to sell the current MacBook Air at a cheaper price is definitely possible. After all, it’s been doing it for years.

Selling an outdated laptop at a cheaper price isn’t the same as designing a product for that specific price point. It’d be the equivalent of Apple selling the iPhone SE — a phone that looked like the iPhone 5S — without completely updating the internals when it came out. Apple’s been doing this for years with the MacBook Air. Not cool.

The cheapest MacBook will always sell well, even with an old design and outdated internals. Giving it another price cut is just taking advantage of that fact, rather than building something innovative for the entry-level market, like the original MacBook. It doesn’t need a Touch Bar and all the fanciest new features. It just needs to be a solid computer at a decent price. If Apple pulls that off, we’ll consider it a great budget laptop again.


The best computer speakers


The built-in speakers lining your laptop or monitor are rarely worthy of praise. With those chintzy stock speakers, you’re missing out on all the nuance that goes into the creation of your favorite songs, film scores, and, of course, Overwatch taunts. A quality speaker system is essential if you want to make the most of your media.

With that in mind, we’ve put together a list of the best computer speakers on the market. We can’t guarantee they’ll make Bastion’s fanfare any less infuriating, but at least it will sound better.

The Best

Aperion Allaire ($400)

The Allaire offers the most complete combination of high-end sound quality, versatility, and connectivity in the genre. With Bluetooth, a digital optical input, an analog input, a subwoofer output, and a USB port for charging devices, the Allaire is suitable for almost any installation. It includes a 1-inch dome tweeter and a 4-inch woven fiberglass woofer. However, what really sells us on this set is its outstanding sound quality, which offers rich bass response, pure midrange details, and pristine treble. For a desktop that often doubles as an entertainment center, there is no better choice at this price point.

Read our full Aperion Allaire review

Buy them now from:

Aperion Audio

The Rest

Edifier e25 Luna Eclipse ($184)

The ultra-modern e25 Luna Eclipse is as outstanding in terms of build quality as it is in sound. The slick, egg-shaped speakers produce a clear, open midrange and a surprising amount of bass via the integrated 3.5-inch drivers and dual passive radiators. They offer phenomenal sound for the price, with a bevy of flashy features to boot – such as an auxiliary port for connecting to additional devices.

Read our full Edifier e25 Luna Eclipse review

Buy them now from:

Amazon

Logitech Z337 ($100)

The overall output of Logitech’s Z337 speaker kit is rather decent for a desktop-oriented system, providing 40 watts of quality sound without rattling the walls: a pair of eight-watt satellite speakers and a 24-watt subwoofer pumping deep bass at your feet. But the kit isn’t just for your desktop or laptop: it’s a great setup for your Bluetooth device too (v4.1 and newer). Other connectivity options include a 3.5mm audio jack, one RCA input, and a headphone jack. Adding to the bundle is a wired audio dial for pairing Bluetooth devices to the system, adjusting the volume, and switching off the power. The subwoofer has its own built-in control dial to manually adjust the bass.

Buy them now from:

Amazon Logitech

Audioengine A2+ ($249)

Audioengine’s A2+ are a simple and sweet speaker upgrade for your computer or related devices. The dynamic bookshelf speakers bask in clear accuracy and nuanced undertones that span the frequency range, while offering an integrated digital-to-analog converter that lets you forego your computer’s analog output in favor of a purer signal. These speakers may lack a bit in the low-end, but they shine with versatility.

Buy them now from:

Amazon

Bose Companion 5 ($399)

Bose has a certain reputation for high-priced speakers that can easily fill a room with swelling sound. In this particular case, that reputation is well deserved. This pricey speaker set (a subwoofer, plus drivers perched on stands to better broadcast their sound) is perfect for supplying office music or serving as your primary house music system. It’s also compatible with additional audio devices via an included USB cable, as well as Bose Bluetooth adapters. If you are looking for a top-line speaker set that’s extra low on maintenance, this makes a great choice.

Buy them now from:

Amazon Bose

Creative GigaWorks T20 Series II ($100) 

Version 2 of Creative’s GigaWorks T20 Series kit consists of two 14-watt speakers, each with a dedicated cloth-domed tweeter and a larger mid-range driver. There’s no included subwoofer; instead, each speaker features what Creative calls BasXPort, a proprietary technology that pushes sound waves from the inner chamber up through an opening at the top of each speaker to produce better, more natural mid-range audio. But you can still control your bass level through a dedicated knob found on the right speaker, along with treble and volume controls. The right speaker even provides a headphone jack and a 3.5mm auxiliary jack for audio input.

Buy them now from: 

Amazon Creative 

 Update: Adjusted to fine-tune the list.


Asus ZenBook Flip 14 vs. Lenovo Yoga 920


Sitting right above the 13.3-inch convertible 2-in-1 class are a few 14-inch machines that provide slightly larger displays without adding too much to chassis size. Lenovo’s Yoga 920 is a classic example, and one of the most robustly built. Asus has just introduced another option, the ZenBook Flip 14, which enjoys the distinction of building in a bona fide discrete GPU.

The Lenovo Yoga line has long been one of our favorites among the premium brands, and the Yoga 920 represents its most refined version yet. We pit the Asus ZenBook Flip 14 against the Lenovo Yoga 920 to see if discrete graphics are enough to win out over one of the best around.

| | Asus ZenBook Flip 14 | Lenovo Yoga 920 |
|---|---|---|
| Dimensions | 12.89 x 8.92 x 0.55 inches | 12.7 x 8.8 x 0.50 inches |
| Weight | 3.31 pounds | 3.02 pounds |
| Keyboard | Full-size backlit keyboard | Full-size backlit keyboard |
| Processor | Up to eighth-generation Intel Core i7 | Up to eighth-generation Intel Core i7 |
| RAM | 8GB or 16GB | 8GB or 16GB |
| Graphics | Intel UHD 620 / Nvidia GeForce MX150 | Intel UHD 620 |
| Display | 14-inch IPS | 13.9-inch IPS |
| Resolution | Full HD (1,920 x 1,080, 157 PPI) | Full HD (1,920 x 1,080, 158 PPI) |
| Storage | Up to 512GB PCIe SSD | Up to 1TB PCIe SSD |
| Networking | 802.11ac, Bluetooth 4.1 | 802.11ac, Bluetooth 4.1 |
| Connectivity | USB-A 3.0 (x2), USB-C 3.1, HDMI, microSD card reader, 3.5mm combo jack | USB-A 3.0, USB-C with Thunderbolt 3, 3.5mm combo jack |
| Windows Hello | Fingerprint scanner | Fingerprint scanner |
| Operating system | Windows 10 | Windows 10 |
| Battery | 57 watt-hours | 70 watt-hours |
| Price | $900 | $1,330+ |
| Availability | Now | Now |
| Review score | 3.5 out of 5 stars | 4.5 out of 5 stars |

Design

The ZenBook Flip 14 shares the same basic aesthetic as the rest of the ZenBook line, with the iconic Asus concentric circle design adorning the lid, and a Slate Gray color scheme that’s rather conservative. It does little to stand out in a crowd. In terms of build quality, we found it offered a mostly rigid chassis that’s just the slightest bit flexible in the lid and keyboard deck. The rather stiff hinge required two hands to open, but kept the display firmly in place.

The Lenovo Yoga 920 was refreshed late in 2017, and it adopted the same basic Yoga look with some subtle changes. There are some additional angles providing a slightly cleaner overall appearance, and three color schemes (Platinum, Bronze, and Copper) are available. The Yoga 920 is relatively thin (0.50 inches compared to the ZenBook’s 0.55 inches) and light (3.02 pounds versus 3.31 pounds), with smaller display bezels for a smaller chassis overall. Most significantly, the Yoga 920 is built like a tank, with not a single surface bending under pressure, and the hinge looks more modern while remaining smooth and effective as it moves through the usual 2-in-1 modes.

Of all the notebooks we’ve reviewed lately that have shaved off some girth, the Yoga 920 has managed to hold onto its impressively solid build. While it’s also somewhat conservative, it’s a much more attractive machine than the ZenBook Flip 14.

Winner: Lenovo Yoga 920

Performance

Both the ZenBook Flip 14 and the Yoga 920 sport Intel’s latest and greatest mobile CPUs, the powerful and efficient eighth-generation Intel Core series. Our review units were equipped with the Intel Core i7-8550U, a chip that packs in four cores, double the count of the previous generation. The CPU manages clock speeds to provide awesome performance when needed, and great efficiency when battery life is paramount.

While Asus and Lenovo both squeeze good performance out of the CPUs, the Yoga 920 was just slightly faster in terms of processor performance. But the ZenBook Flip 14 has a trick up its sleeve — it incorporates a discrete GPU, the low-end Nvidia GeForce MX150. This isn’t a hardcore gaming GPU, but rather one suited for older titles and esports gaming; even so, it’s significantly faster than the usual integrated Intel graphics inside the Yoga 920.

If you want to do some light gaming on your convertible 2-in-1, or to encode some video, then you’ll appreciate the ZenBook Flip 14’s mobile graphics performance.

Winner: Asus ZenBook Flip 14

Keyboard, Mouse, and Pen

Mark Coppock/Digital Trends

In our review, we weren’t terribly impressed by the ZenBook Flip 14’s keyboard, finding that while it offered decent key travel, the bottoming action was much too abrupt and made typing less comfortable and precise. The touchpad was good, however, with Microsoft Precision touchpad support and a large surface area — although some of it is taken up by the Windows Hello-supporting fingerprint scanner. Asus includes its active pen with the ZenBook, but the machine’s display is oddly sticky and makes inking an imprecise affair.

The Yoga 920’s keyboard, on the other hand, is of typical Lenovo quality. While we found the keys a bit stiff, the good travel and distinct tactile feedback made for a precise typing experience. The Microsoft Precision touchpad was less responsive than we like, though, with some issues using gestures. The Lenovo active pen was another standout, offering four times the pressure sensitivity (4,096 levels versus 1,024) on a display that provides a more suitable surface, and thus a significantly better inking experience. The Yoga 920 also uses a fingerprint scanner for Windows Hello support.

Although the ZenBook Flip 14’s touchpad was better, the Yoga 920’s keyboard and pen are vast improvements. They combined to give Lenovo the win in this round.

Winner: Lenovo Yoga 920

Connectivity

Asus focused mostly on providing legacy connectivity support on the ZenBook Flip 14, with two USB-A 3.0 ports and a full-size HDMI port to go with the single USB-C 3.1 port. That’s in addition to the microSD card slot, 2×2 MU-MIMO Wi-Fi, and Bluetooth. Most notably, Asus did not equip its 2-in-1 with Thunderbolt 3, which limits display support and negates the option of an external GPU enclosure.

Lenovo, on the other hand, looked to both the past and the present with the Yoga 920, equipping a USB-A 3.0 port to go with two Thunderbolt 3-supporting USB-C ports. There’s also an SD card reader and the usual Wi-Fi and Bluetooth connectivity.

The Yoga 920 might not have a discrete GPU, but its Thunderbolt 3 support means you can plug in an external GPU enclosure for even faster gaming than the ZenBook Flip 14 as long as you’re sitting still. That gives Lenovo another win.

Winner: Lenovo Yoga 920

Display

Mark Coppock/Digital Trends

We found the ZenBook Flip 14’s 14-inch Full HD (1,920 x 1,080 or 157 PPI) display to be slightly below average for the premium 2-in-1 class. Brightness was particularly low, and contrast and color gamut were both at the low end of the range. It was a decent enough experience subjectively, as even average premium displays are today, but its objective qualities leave something to be desired.

The Yoga 920’s 13.9-inch Full HD (158 PPI) display was quite a bit better. First, brightness was much higher, exceeding the 300 nit baseline we like to see, and its contrast and color support were superior. Perhaps most important, Lenovo also offers a 4K UHD (3,840 x 2,160 or 317 PPI) option for the Yoga 920, something that Asus is not offering for the ZenBook.
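As a quick sanity check on the pixel-density figures above (our own arithmetic, not a spec from either manufacturer), PPI is simply the diagonal resolution in pixels divided by the diagonal screen size in inches:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(1920, 1080, 14.0)))   # ZenBook Flip 14 Full HD  -> 157
print(round(ppi(1920, 1080, 13.9)))   # Yoga 920 Full HD         -> 158
print(round(ppi(3840, 2160, 13.9)))   # Yoga 920 4K UHD option   -> 317
```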

Not only is the Yoga 920’s entry-level display better, but you can step up to a much sharper 4K option as well. That’s a real win for Lenovo.

Winner: Lenovo Yoga 920

Portability and Battery Life

Intel’s eighth-generation CPUs are very efficient when they’re working on typical productivity tasks. They’ll use more battery if you’re doing more intensive work like encoding video, but Office tasks, web browsing, and watching movies will barely move the battery meter.

In our testing, the Yoga 920 and ZenBook Flip 14 both burned through the battery at close to the same clip when performing more CPU-intensive tasks, coming within a few minutes of each other when running our intensive Basemark test and looping through a series of web pages. The Yoga 920, however, lasted almost four hours longer when playing a local video, meaning it’s going to last much longer when you’re consuming media.

Both 2-in-1s will likely get you through most of a day’s work without needing to plug in, but you can’t ignore the Yoga 920’s power-sipping ways when watching movies and TV shows.

Winner: Lenovo Yoga 920

Availability and Price

Both the Yoga 920 and the ZenBook Flip 14 can be equipped with configurations that lie firmly in the premium notebook space. Our review ZenBook was built with a Core i7-8550U, 16GB of RAM, and a 512GB SSD for a retail price of $1,300. That compares to $1,600 for a similarly equipped Yoga 920. You can also pick up a budget configuration of the ZenBook for $900, with a Core i5-8250U, 8GB of RAM, and a 256GB SSD. The equivalent Yoga 920 runs a much more expensive $1,200.

We have to give some kudos to Lenovo for offering a 4K display option, which adds $200 to the price, along with the option of a 1TB SSD that Asus omits. But the ZenBook Flip 14 is clearly the more frugal machine, and in fact its budget option is a great alternative for anyone looking to save some money.

Winner: Asus ZenBook Flip 14

The Yoga 920 is a more refined machine

The ZenBook Flip 14 is a two-trick pony in this particular comparison. It has faster graphics allowing for some entry-level gaming, and it’s less expensive. Those are good qualities to have.

The Yoga 920, however, is a more refined convertible 2-in-1, with class-leading build quality, a conservative yet modern aesthetic, and better processor performance and battery life. If you really want to run Rocket League while on the road, then the ZenBook Flip 14 is a good option. For everyone else, the Yoga 920’s overall design wins out — and if you insist on gaming, then put a real GPU into an enclosure and plug it into the Thunderbolt 3 port. That’s something the ZenBook just can’t support.


The top trends we saw at Mobile World Congress 2018


Mobile World Congress 2018 has come and gone, and while one of the biggest smartphones of the year launched at the trade show in Barcelona, there’s still a lot of information to glean about what we can expect from smartphones for the rest of the year.

From the continuing bezel-less trend and the return of the notch, to artificial intelligence and ever-improving budget phones, here are the top trends we noticed at MWC.

Bezel-less trend is here to stay

Julian Chokkattu/Digital Trends

The bezel-less trend, where manufacturers trim the edges around a smartphone screen, is far from over. It all began with the Xiaomi Mi Mix in 2016, and now even Apple has embraced it, increasing the screen size of the iPhone X while cutting the bezels around the screen. The benefit is a smaller, narrower phone with more screen real estate. For example, Apple’s iPhone X is smaller than the iPhone 8 Plus, but it has a bigger screen.

Narrower devices also mean phone-makers are adopting the 18:9 aspect ratio over the traditional 16:9 ratio, which means you’ll see more content in vertical-scrolling apps, as well as have more space for multi-tasking with split-screen mode. Even budget phones like Alcatel’s new 5 series are adopting the aspect ratio, and that means we can expect the bezel-less trend and the 18:9 aspect ratio to trickle down to more budget phones this year. Hardly surprising, since flagship features are always quick to jump down to affordable devices.

Copying the notch

Asus ZenFone 5 (Simon Hill/Digital Trends)

But there’s one more display-related trend that won’t be going away soon: The notch. If you take a look at the iPhone X, you’ll see a notch-like cutout sticking into the screen at the top. This houses the TrueDepth cameras and sensors that help make Face ID and Animojis work. The notch hasn’t exactly been well-received, but at least it’s packed with plenty of tech powering a bunch of new features. At MWC, it’s evident smartphone manufacturers want to copy the notch design, without adding any nifty features. Instead of using a black stripe like Samsung, device-makers are adding notches to their phones for the sake of copying Apple. We’ve rounded up a few, including Asus with the ZenFone 5, but rumors point to Huawei imitating Apple with the upcoming P20 as well. These notches don’t add much value other than housing the front-facing camera.

Thankfully, some manufacturers are toying around with different designs. Vivo, for example, has a camera protruding from the top of the phone, and Xiaomi put the front-facing camera on a single, bottom bezel. Either way, expect more phones with notches this year until someone figures out how to make a truly bezel-less smartphone.

Everyone’s working on artificial intelligence

Julian Chokkattu/Digital Trends

LG hardly had a presence at Mobile World Congress, and its only major announcement was a new version of the well-received LG V30. The LG V30S ThinQ has more RAM and more internal storage, but it’s also packed with artificial intelligence. LG’s A.I. improvements come to the camera app specifically, where it can identify scenes and apply the right colors to make a photograph pop. These additions are hardly a reason to buy the phone, but they demonstrate that LG is serious about developing and improving its own A.I. Samsung is also going strong with Bixby, introducing a handful of new camera-related A.I. features to Bixby Vision. A.I. was on the tip of everyone’s tongue, but we’ve yet to see any game-changing improvements like what Google and Apple have already integrated into their respective operating systems. This year, expect to see every phone launching with some type of highlight A.I. feature meant to improve your smartphone experience. Whether it will actually improve anything remains to be seen.

AR still needs a killer app

Brenda Stolyar/Digital Trends

Like A.I., augmented reality was also a must-have feature at MWC. ZTE’s Blade V9 smartphone has some basic augmented-reality stickers baked into the camera app, but it’s Samsung’s Galaxy S9 that took the spotlight with AR Emoji. Similar to Apple’s Animoji, you can create your own emoji and use motion tracking to record a video. It doesn’t work as well as Animoji, but it goes a step further by creating a sticker pack based on your likeness. Some of Samsung’s other AR improvements include a makeup tool built into the camera app, allowing people to layer styles on their face and make purchases directly in the app. While both AR Emoji and Makeup are neat additions, we’ve yet to find a game-changing AR app or feature we want to use continuously. We imagine every camera app on every smartphone in 2018 will have a separate category dedicated to augmented reality.

Goodbye wearables

Julian Chokkattu/Digital Trends

Last year, Huawei launched the Huawei Watch 2 at MWC. In 2018, not a single major manufacturer announced a smartwatch. Yes, there were companies present showing off wearables, but beyond an assortment of fitness trackers, hybrid watches, and quirky wearables, we hardly saw any innovation in the category. Wearables aren’t dead, as evidenced by Fitbit and Apple, but competitors still aren’t offering compelling reasons to buy them. Google specifically mentioned that improvements to Android Wear will come at its developer conference this year, but we’re not expecting any major developments in this category.

Budget and feature phones get better

Julian Chokkattu/Digital Trends

We say this every year, but budget phones are getting even better. Like bezel-less displays, tech found in high-end smartphones quickly trickles down to their affordable counterparts. HMD Global’s $345 Nokia 6 is a good example — it’s packed with a lot of tech, and the build quality is excellent. But at MWC this year, Google launched a slew of Android Go smartphones. Android Go is a version of Android for phones with very little RAM and storage, as the core operating system takes up very little space. These phones cost under $100, like the Nokia 1, and they are capable of offering a solid smartphone experience for very little money. We think they’ll prove quite popular, and there are plenty of other budget smartphones to come that are sure to make us question the price tag.

Dumb phones are also seeing some improvement. An operating system called KaiOS is quickly becoming the software of choice for feature phones, such as the Nokia 8110 4G. Before you scoff — there are 1.3 billion feature phone users in the world, and it’s estimated that about 600 million of these devices will be sold every year for the next five years. Dumb phones are significantly cheaper than touchscreen smartphones, and they often require very little data. KaiOS is bringing smart functionality and apps to these phones, like Google Maps, Google Assistant, and even Facebook, in an effort to bridge the digital divide.

5G looms

Maskot/Getty Images

You couldn’t walk anywhere at MWC without seeing some type of advertisement or promotion for 5G technology. The fifth-generation network aims to bring super-fast internet connectivity, which will benefit a variety of services and industries. We spoke to every major U.S. carrier for an update on 5G, and the consensus is that we’ll see the technology deployed in various cities by the end of this year. That doesn’t mean you’ll get to use it before 2019, though, which is when we expect to see the first smartphones capable of utilizing the network. Everyone’s racing to be the first 5G carrier, or to make the first smartphone that supports 5G. Expect a lot of talk about download and upload speeds at major smartphone press conferences this year.

Editors’ Recommendations

  • The LG V30: Everything you need to know about LG’s flagship smartphone
  • LG unveils two patents for foldable smartphones
  • Comparing smartphones to find the most bezel-less design
  • All phones in Alcatel’s 2018 budget lineup feature an 18:9 display
  • Here are all the Nokia phones HMD Global unveiled at MWC 2018


6
Mar

Thieves heist 600 PCs built for digital coin mining in Iceland


It’s a scenario ripped straight out of the movies: Thieves invaded data centers peppered across Iceland to steal around 600 computers used to mine digital currency. Dubbed the “Big Bitcoin Heist” by local media, the series of thefts has left police baffled; they say it is the biggest they have ever seen. The total value of the stolen hardware is around $2 million, or just over $3,300 per machine.

But don’t let the report fool you. The heist wasn’t one massive raid on Iceland’s data centers in a single night. Three burglaries took place in December, while a fourth followed in January. Police waited until after the fourth heist and the arrest of 11 individuals before going public; only two remain in custody.

“This is a grand theft on a scale unseen before,” Olafur Helgi Kjartansson, the police commissioner on the southwestern Reykjanes Peninsula, told the Associated Press. “Everything points to this being a highly organized crime.” 

So why Iceland? According to reports, its geothermal and hydroelectric power plants provide the cheap, renewable energy needed to power farms of PCs mining digital currency. The “mining” aspect simply means these PCs help maintain the digital currency platform, whether it’s Bitcoin or Ethereum, and receive digital coins in return. Right now, a single Bitcoin is worth $11,561 in North America.

And that is the fuel behind the theft. Digital currency is not maintained by any one government, nor is it easily traced. With 600 computers, these thieves could generate millions in cash without a trace. But there’s a drawback: power. PCs need power to mine digital coins, so local police are keeping a close eye on power consumption across Iceland in hopes of catching the thieves red-handed.

But the thieves are unlikely to run all 600 PCs in one centralized location, so the net will need to be wider than simply watching for heavy power use. Iceland’s law enforcement officials are now calling on storage unit providers, electricians, and internet service providers to keep tabs on large pockets of PCs drawing unusual amounts of power and bandwidth.

With Bitcoin, miners can’t just dig up a single coin and exchange it for cash. Instead, they mine a Bitcoin block, which currently generates a little more than 12 digital coins. Smaller miners typically pool their PCs together and receive one Bitcoin for contributing one-twenty-fifth of the computing power needed to generate a block. The catch is that mining the next block at the speed of the previous one requires more computing power, pushing miners to add more hardware or, as seen in Iceland, steal farms of dedicated PCs.

Cryptocurrency mining today really isn’t a job for a single PC. Because the currency relies on cryptography, a mining PC needs to perform an enormous number of hash operations every second; the more powerful the hardware, the more operations it can run. A single PC hoping to mine one block in a single month would need to perform around 3 quadrillion hash operations per second, which is why farms of networked PCs with dedicated mining hardware are a must.
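Those “hash operations” are what proof-of-work mining actually spends all that electricity on. Here is a deliberately simplified sketch of the loop, with a toy difficulty target and a made-up header; real Bitcoin mining compares the full 256-bit hash against a network-set target and runs on specialized hardware:

```python
import hashlib
import time

def mine(block_header, difficulty_prefix="0000"):
    """Toy proof-of-work: find a nonce whose double SHA-256 hash of
    (header + nonce) starts with the required run of zero hex digits."""
    nonce = 0
    while True:
        payload = block_header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce, digest
        nonce += 1

start = time.time()
nonce, digest = mine(b"example block header")
print(f"nonce={nonce}  hash={digest}  took {time.time() - start:.1f}s")
```

A desktop CPU gets through a few million of these operations per second; set against the quadrillions per second cited above, that gap is why the stolen machines were dedicated mining rigs rather than ordinary PCs.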


From classic to cloud: How I learned to love Lightroom CC


Daven Mathies/Digital Trends

I was there when Adobe showed off a completely redesigned version of Lightroom CC last year at Adobe MAX. Between the cheers and the applause, beneath the palpable excitement of the 12,000 creatives in attendance, I found myself feeling just one thing in the wake of the announcement: Relief. I didn’t care about new features or capabilities; I just stared into that simplified, minimalist, matte gray interface like I was watching the sun rise after a cold night.

As a photographer, I had never enjoyed working in the original Lightroom (now called Lightroom Classic). I found it to be a headache-inducing program that was as painful as it was powerful. It ran like an antique car, but without the spit and polish that would have at least made it nice to look at. It was an artifact of the PC age, when beige was an appropriate color for consumer tech products.


It was time to evolve, and here before me, finally, was a new Lightroom, completely rebuilt from the ground up. Here was the modern user experience photographers had long deserved. Gone was the module-based interface that tried to force users into a linear workflow — and with it, gone were five modules that I never used at all: Map, Book, Slideshow, Print, and Web. The Develop and Library modules were merged into one, and you could begin editing a photo simply by clicking (or tapping) on the editing controls button, without having to reload the image in a new module. It was glorious.

Sure, other photographers no doubt had uses for those other modules, but Lightroom Classic was foremost an image organization and editing tool — one that had grown fat and slow in its old age. Lightroom CC, by comparison, looked modern and streamlined, with a renewed focus on the simple thing that gave it reason to exist at all: Your photographs.

Adobe continues to update Lightroom Classic, but there was a general feeling that day at MAX that the company was gently trying to corral users toward Lightroom CC. After all, it was the new app that took the Lightroom name; the old version received a tailpiece that all but destined it for wherever legacy software goes to die. Why not just call it Lightroom Jurassic? Recycle its binary bones into fuel for software better adapted to the post-PC era. Even Adobe’s Bryan O’Neil Hughes proudly proclaimed in his presentation that he had made the switch a year and a half earlier, and hadn’t looked back. If one of the most experienced Lightroom professionals on the planet had been living happily through buggy, pre-release versions of the new software for that long, then certainly it would work for me.

Daven Mathies/Digital Trends

Naturally, I downloaded the app as soon as I possibly could and started working in it that afternoon from my hotel room. But while I instantly loved aspects of it, I quickly found it lacked features that were indispensable. Disheartened, back to Classic I went — and that’s where I stayed for the next several months.

Fortunately, Adobe scrambled to bring new features to the app over that time, and Lightroom CC has grown into a competent photo editor. I finally decided to try making the switch again, and am pleased to report that I, too, haven’t looked back — even if, at times, I have to force myself not to. If you haven’t made the switch yet, it’s time to at least take a look.

Get your head in the cloud

The first thing to understand about Lightroom CC versus Lightroom Classic is that it presents an entirely new workflow paradigm. The interface is unified (as much as it can be) across desktop and mobile platforms, and just about everything — even your RAW files — is automatically backed up to the cloud and accessible from anywhere. You can start an edit on your phone while in the field and finish it from your computer at home without skipping a beat. Adobe demonstrated this live, jumping between an Apple iPhone 8, an iPad Pro, and a Microsoft Surface Book 2.

But the thing about product demonstrations is that they only show the awesome parts of something — not the unbearable parts. RAW files take up a lot of space, and if you’re any type of working professional, it’s easy to come back from a shoot with gigabytes upon gigabytes of images. Most internet service providers offer connections built for the consumption, rather than the creation, of content, with upload speeds that are often many times slower than download speeds. In dense cities, you may have a better option — but in the rural small town where I live and work, I have to deal with an upload speed of just 4 megabits per second.


For a recent camera review, I shot some 300 photos, which amounted to a little under 10 gigabytes of data — not a large shoot, by any means. But at 4Mbps, those 300 photos would take about five hours to upload to the cloud. That’s five hours before I can use them (at least, all of them) on another device, five hours before Adobe’s AI-powered Sensei search works, and five hours before I can dependably log in to a lag-free game of Destiny 2.
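The arithmetic behind that estimate is simple enough; here is a quick back-of-the-envelope version using the rough figures above (the exact numbers are illustrative):

```python
shoot_size_gb = 10      # "a little under 10 gigabytes" for the 300-photo shoot
upload_mbps = 4         # the upload speed available where I live

bits_to_send = shoot_size_gb * 8 * 1000**3           # gigabytes -> bits
hours = bits_to_send / (upload_mbps * 1000**2) / 3600
print(f"{hours:.1f} hours")                           # ~5.6 hours
```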

Now imagine coming back from something like a wedding with not 300 but 3,000 photos. I’ll let you do the math.

Internet slow enough to stifle a fax machine certainly isn’t Adobe’s fault, but it’s something to be aware of before you go all in on a cloud-based workflow. Toting around a portable hard drive and manually syncing Lightroom Classic catalogs between your laptop and desktop — as tedious as it can be — may be the more efficient solution for some users, so long as you don’t care about having RAW files accessible on your mobile device.

Unfortunately, even if you wanted to, you can’t really use Lightroom CC this way. When it comes to file management, you’re more or less stuck in the cloud. Makes sense: Adobe wants to sign you up for paid cloud storage plans, after all. The entire concept of a photo “catalog” has vanished. You can still create albums to separate projects, but Lightroom CC now keeps all of your photos under one umbrella. This isn’t inherently bad — and may be the way most people used Lightroom Classic in the first place — but I prefer to create new catalogs for different projects, or at least project categories, to keep things nimble and organized. I have no need to see product photos from a review shoot alongside portraits from a wedding job.

Daven Mathies/Digital Trends

Lightroom CC’s import and export options are also woefully limited (there aren’t even keyboard shortcuts to bring up the import and export windows). You can’t add any metadata on import, and the only filetypes available on export are original or JPEG. And for the latter, the only control you have is setting the long dimension; there is no ability to set the amount of JPEG compression. You can’t even choose to name and sequence the files on export.

Moreover, export presets are completely gone, not that this matters much given the lack of options. This is bad news for me, as I used various presets in Lightroom Classic for different purposes, from outputting photos sized to Digital Trends’ standards, to highly compressed files for social, to full-resolution images for archiving.

But that’s the thing: Lightroom CC wants you to keep all of your photos in the cloud, and use its built-in sharing options to share images and albums with other people. If you continuously archive your work to an external drive and clear it from your catalog, you’re not going to buy upgraded cloud storage plans from Adobe. But I have no need to keep all of my images accessible in the cloud past their due dates. Once I deliver on a job, I’m out — archive, backup, delete. Lather, rinse, repeat.


To be sure, it’s not impossible to export and delete images from Lightroom CC. The program is just set up in such a way as to make doing so less convenient than leaving them where they are. Aside from the annoyance of scrolling through old photos and albums I no longer actively need, this might not be a huge problem — except that cloud storage is also expensive.

Adobe offers several different pricing options, but the standard $10-per-month Photography Plan is arguably the best deal. With it, you get Photoshop, both versions of Lightroom, access to the (really quite cool) Spark mobile apps, and 20GB of cloud storage. It appears that new users can buy into a 1TB plan for $20 per month, but for whatever reason, I was able to upgrade to it for just $15 per month. This is a pretty good deal. A 1TB Dropbox Plus plan, for comparison, is $8.25 per month — and doesn’t come with, like, seven programs.

However, if I wanted to jump up to 2TB — which I would need to, if I wanted to keep even a majority of my photos in the cloud — the price leaps to $30 per month, which makes no sense whatsoever. I could buy several 2TB hard drives for the cost of a single year of the 2TB cloud plan. Call me old school, but given that I don’t have a need to access all of my photos all the time from any device — and that paying for that ability would be considerably more expensive than backing up files locally — there just doesn’t seem to be incentive to do it.
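To put rough numbers on that claim (the drive price here is my own ballpark assumption, not a figure from Adobe or anyone else):

```python
cloud_2tb_monthly = 30            # Adobe's 2TB tier, per month, as cited above
yearly_cloud_cost = cloud_2tb_monthly * 12           # $360 per year

drive_price = 70                  # assumption: rough street price of a 2TB external drive
print(yearly_cloud_cost // drive_price)              # ~5 drives for one year of the plan
```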

So why not stick with Lightroom Classic?

Here’s the thing: As much as I complain, the truth is I really like Lightroom CC. It is the modern Lightroom I’ve been waiting for Adobe to build for years. The user interface is beautiful and responsive, the editing and organization are more streamlined, and it’s not full of things I don’t need. It offers a much more enjoyable experience than Classic, and while I certainly don’t require cloud access to all of my photos, it is nice to not have to bring an external hard drive with me when I’m on the road.

What’s more, if a client makes a request for an image while I’m out with just my phone, I don’t have to wait until I’m home to deliver. I can pull it up in the iOS app, make any quick edits, and send it off right there before my latte even gets cold. You can even access your Lightroom CC library from any computer via the web app. Editing Nikon RAW files inside of Google Chrome feels a bit like magic.

Sure, Lightroom CC still puts annoying limitations on users — the lack of export options is particularly irksome — but none of those are enough to make me miss Lightroom Classic; not enough to go back, anyway.

My only other complaint is that I simply can’t afford to use Lightroom CC the way it was meant to be used, by storing all of my images in the cloud. Sure, I don’t have a real need for this, but for some people it would really be a simpler solution, and it’s just a shame that the cost of cloud storage will prevent people from using Lightroom to the best of its abilities. For now, the 1TB plan I can afford is sufficient to hold my current working projects, and gives me time to back up files locally before removing them from Lightroom.

It’s not a perfect solution, but what program is perfect? For me, Lightroom CC has reached the point where its imperfections weigh less than Lightroom Classic’s. But this isn’t about picking the less bad option; Lightroom CC looks and feels like the future, and I am hopeful that most of my lingering concerns will be addressed in updates down the road.

Of course, that still won’t help with the pitiful upload speeds that pass for broadband at American ISPs. But as more and more consumers begin relying on cloud services, perhaps the increased demand for faster uploads will push those ISPs in the right direction. At least, we can hope?


The PC port of Final Fantasy XV is gorgeous, if your hardware can handle it


Final Fantasy XV is a gorgeous game. The art design, the detail, every square inch of this game is a feast for the eyes — if you have the right hardware. Unlike its console counterpart, finding a balance between visual quality and performance is a bit of a chore, but we have a few tips that should help you get the most out of the game.

Testing conditions

First up, let’s talk about how we tested Final Fantasy XV. The system we used is a desktop PC with an AMD Ryzen Threadripper 1920X 12-core processor. That’s a lot of horsepower, and we’re aware that it’s pretty far outside the norm for all but the most high-end desktop PCs, but we use it for a reason. It’s among the quickest processors you can get your hands on, which means our results won’t be bottlenecked by a slow processor. That’s also why our testing PC has 32GB of RAM, and a lightning-fast 512GB SSD. We want to remove all the speedbumps that could trip up our results.

That means the results we discuss below will be almost entirely dependent on the graphics card, arguably the most important part of any desktop gaming PC. The graphics cards we tested fall into every price category up and down the scale, as we wanted to get a wide sample to find out how well the game does on even entry-level hardware.

On the Nvidia side, we tested four cards from across the GTX lineup, two of them Ti-grade: the GTX 1080 Ti, GTX 1070 Ti, GTX 1060, and GTX 1050.

The red team, AMD, has a slightly larger catalogue of current-gen graphics cards, so we picked four of our favorites — the RX Vega 64, RX 580, RX 570, and RX 550. For those of you keeping score, that’s two high-end cards, a mid-range card, and an entry-level card. It’s a nice sample from each tier of AMD’s graphics card lineup.

Presenting the presets

Every game’s presets differ, but it’s always helpful to take a look at what amounts to the developer’s suggested graphics settings. Like most games, FFXV offers four presets — Low, Average, High, and Highest. Here, there’s a clear difference at each level, but pay extra-close attention to Noctis’ clothes.

FFXV’s bro-squad is the game’s heart and soul, so it’s no surprise they receive lavish detail. Turning up the game’s graphics makes those extras obvious. Buttons, zippers, and other pieces of flair become visible at High detail, and really stand out at Highest.

The realism of each character’s clothing also improves dramatically as detail is ramped up. At Low detail, the team’s leather jackets look fuzzy and lack fine detail, which makes them look like poorly built fakes purchased from eBay. Average is a big improvement, adding fine details that provide a better sense of depth and texture. At Highest, those details become crisp even when close to the camera.


Low and Average detail also suffer from a lack of shadow detail and depth which, combined with lower texture resolution, hurts the overall presentation’s contrast. Objects further from the camera, like the diner in the background, look flat and washed-out at Low and Average. The High and Highest settings add shadow details that noticeably improve the game’s look.

Still, FFXV tolerates its lower presets with grace, and some elements suffer little from the downgrades. Each character’s face, flesh tones, and hair see less degradation than we might’ve expected, even at Low detail. The game’s strong art design helps, as both characters and environments have readily visible themes that don’t rely on fine details to express themselves.

Obviously, it’s best to play at the highest detail possible, but we think FFXV looks great even at Average detail, and even Low is all right, though its texture resolution suffers. This, as you’ll soon see, means the game is enjoyable on a wide variety of hardware.

Let’s start with the best: 4K

Guess what? Final Fantasy XV looks great at 4K. Not a surprise, but it’s worth saying, because this game is a sight to behold with every setting — resolution included — pushed to its absolute maximum. However, running the game at 4K requires some serious hardware. In our test rig, even our Nvidia GeForce GTX 1080 Ti had trouble keeping up with Final Fantasy XV’s intense and richly detailed visuals at 4K. Let’s look at the numbers.

Here, performance breaks along predictable lines. The 1080 Ti is in the lead by a sizable margin, while the 1070 Ti and AMD Radeon RX Vega 64 achieve playable framerates, if only barely.

There are a couple of things we can learn from these results right off the bat. First, look at the GTX 1050 and RX 550. Neither card can run this game in 4K. We could barely get in and out of the game because it ran so slowly on both cards. It was a slideshow. It’s worth mentioning because our GTX 1050 and RX 550 cards both had 2GB of video memory, and that’s definitely not enough to handle FFXV at this resolution.


High-end cards like the RX Vega 64 and GTX 1080 Ti have a lot more than that — the Vega 64 has 8GB, and the 1080 Ti has a whopping 11GB. The performance gap between these sets of cards tells us that this game not only requires a lot of graphical horsepower, but also a sizable amount of video memory.

That’s why cards closer to the mid-range, like the 1070 Ti and 1060, have an easier time running the game at 4K on Average or Low settings. They have enough video memory to handle assets like textures and models, alongside the rendering horsepower to keep the game running smoothly.
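To put rough numbers on that, here’s a back-of-envelope sketch. It isn’t Square Enix’s actual memory budget; the buffer count and texture-pool size are assumptions we picked purely for illustration.

```python
# Rough, illustrative VRAM math. BUFFER_COUNT and TEXTURE_POOL_GB are made-up
# values for the sake of the example, not figures from the game itself.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

BYTES_PER_PIXEL = 4       # 32-bit color
BUFFER_COUNT = 3          # assumed: color, depth, and one post-processing target
TEXTURE_POOL_GB = 4.0     # hypothetical texture pool at a Highest-style setting

def render_target_mb(width: int, height: int) -> float:
    """Approximate memory used by the render targets alone, in megabytes."""
    return width * height * BYTES_PER_PIXEL * BUFFER_COUNT / (1024 ** 2)

for name, (w, h) in RESOLUTIONS.items():
    targets = render_target_mb(w, h)
    total_gb = targets / 1024 + TEXTURE_POOL_GB
    print(f"{name}: ~{targets:.0f} MB of render targets, "
          f"~{total_gb:.1f} GB total with the hypothetical texture pool")
```

The render targets themselves only grow from about 24MB at 1080p to about 95MB at 4K. It’s the high-resolution textures and models that eat most of the memory, which is why a 2GB card chokes long before an 8GB or 11GB card notices.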

In the end, if you’re looking to run FFXV at 4K, you’re going to want to stick with the upper tier of graphics hardware. The lowest you’ll want to go is the GTX 1070 Ti or Radeon RX Vega 64.

1440p is far more forgiving

At 1440p, performance opens up a bit, and that’s good news if you don’t have a GTX 1080 Ti lying around. This resolution is a great middle ground. You can still enjoy crisp image quality, but you give yourself a ton of wiggle room to keep framerates high.

Look at those framerates! At 1440p Final Fantasy XV still looks great, and it’s easy enough on mid-range hardware that you won’t be stuck in a slideshow every time you draw your sword. Yes, your sword materializes in a shower of blue sparkles every time you draw it. It’s a whole thing.

Looking at our test data, the only cards that couldn’t run the game at 1440p were the GTX 1050 and RX 550. All the others were more than capable of maintaining an enjoyable framerate at Average or High settings. Even the mid-range GTX 1060 maintained a playable framerate of 36 FPS at 1440p, at the Highest detail preset. The RX 570 had a bit of a tougher time, but played the game just fine at High settings, and you won’t even notice the loss in detail.

1080p, the people’s resolution

Finally, we come to the most common resolution – 1080p. Here, almost every card in our test suite managed a playable framerate most of the time. Well, with one exception. Let’s get to the numbers.

Naturally, the GTX 1080 Ti, GTX 1070 Ti, RX Vega 64, and RX 580 had no problems here. They just walked right through to the Highest detail setting and asked, “Is that all you got?”

Even the mid-range cards, starting with the GTX 1060, did well enough. At the Highest detail preset, the GTX 1060 hit an average framerate of 49 FPS, and the RX 570 hit a just-barely-playable 29 FPS at the High preset.

The entry-level cards were a mixed bag, though. The GTX 1050 managed a comfortable framerate, 31 FPS, at the lowest detail setting. The RX 550 barely managed 24 FPS on average at the same detail setting. It’s a disappointing result for anyone with an RX 550. If you’re one of those people, try stepping your resolution down to 720p — you should be able to manage a decent framerate at that resolution.
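If you’re wondering why dropping to 720p helps so much, the pixel math tells the story. The sketch below is a deliberately naive estimate, since real games don’t scale perfectly with pixel count, and the 24 FPS figure is simply the RX 550 result quoted above.

```python
# Naive illustration of why dropping from 1080p to 720p frees up so much headroom.
# Assumes performance scales linearly with pixel count, which is optimistic.

p1080 = 1920 * 1080     # 2,073,600 pixels
p720 = 1280 * 720       # 921,600 pixels

ratio = p720 / p1080
print(f"720p renders {ratio:.0%} of the pixels of 1080p")        # ~44%

fps_1080p = 24          # the RX 550's 1080p result at the lowest preset
print(f"Best-case 720p estimate: ~{fps_1080p / ratio:.0f} FPS")  # ~54 FPS
```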

Outperforming the presets

All right, let’s dig into the nuts and bolts of the graphics menu. Like most games, FFXV gives you tons of options here. Each one controls a specific part of the game’s visual palette. On their own, most of these settings will only impact your performance a small amount, but taken together, they can make your game look great, or run poorly, depending on your hardware.

If you’re not comfortable tweaking some of these settings, we have some good news for you. It’s not worth it.

No, you read that right. This is our first performance guide where we would actually recommend you stick with the presets if you want the best performance and the best graphics your hardware can handle.

Each preset is so well-tuned, we couldn’t improve on them.

Square Enix really did its homework here. Each preset — Low, Average, High, and Highest — is custom-tuned to such a degree that we couldn’t actually improve on them despite our best efforts. We went through each setting and measured precisely how big an impact each one had on overall performance.

In the end, we found that the built-in quality presets mitigate those performance hits better than anything we could put together. Still, there are a few settings you should keep an eye on if you find you’re hitting performance snags.
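For the curious, here’s a minimal sketch of how that kind of per-setting audit can be tabulated. The FPS figures are placeholders we chose so the TRAM and Shadows costs mirror the 11 and 10 percent hits discussed below; they are not our raw benchmark data.

```python
# Minimal sketch: rank each setting by the framerate it costs versus a baseline
# run. All FPS values here are placeholders, not actual benchmark results.

baseline_fps = 60.0
runs = {
    "TRAM (Highest)": 53.4,          # works out to an 11 percent cost
    "Shadows (Highest)": 54.0,       # works out to a 10 percent cost
    "Anti-aliasing (Highest)": 57.0, # hypothetical smaller hit
}

impact = {setting: (baseline_fps - fps) / baseline_fps for setting, fps in runs.items()}

for setting, loss in sorted(impact.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{setting}: {loss:.0%} framerate cost")
```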

Riding the TRAM

TRAM, or Texture RAM, is one of the biggest performance hogs. Turned all the way up, we lost 11 percent of our overall framerate. This setting governs how much of your video memory is earmarked for game textures, which is why you’ll definitely notice when it’s turned up or down.

Look at how detailed every single piece of Noctis’ outfit is at the Highest TRAM setting. His buttons have little skulls on them, his t-shirt has skulls on it, and even the many extraneous zippers on his outfit have an impeccable level of detail. Stepping down to High, you lose a little detail, but the biggest hit comes at Average. Here, you lose almost all of the fine detail, and moving down to Low, you lose all but the barest suggestion of these details.

What we do in the Shadows

Second place, as usual, goes to the Shadows setting. Like TRAM, this one affects nearly every single frame of Final Fantasy XV, so it’s no wonder that when it was turned all the way down, we saw a 10 percent bump in performance.

Look at those friends, standing in our shadows, messing up our screenshots. They’re actually doing us a favor. Keep an eye on the edges of the shadows as you scroll through the screenshots. At the Highest setting, they’re sharp, detailed, and lush. At the Low setting, they’re fuzzy and amorphous — though they don’t look awful.

Bottom line

In the end, Final Fantasy XV will look great pretty much no matter what you do to it. Even at the lowest settings, it’s still a gorgeous game, and that’s a credit to stellar art direction and a richly detailed world. If you’re having trouble maintaining an acceptable framerate, try stepping down to the next lowest preset; chances are it’s going to offer better performance and visual quality than digging into the settings yourself.

It’s good the game’s art holds up at lower settings, because you may need to use them. That’s particularly true if you want to run at 60 frames per second. Falling back to 30 frames per second shows your hardware some mercy.

We can certainly recommend FFXV if you want a graphical showcase, but be prepared. This game can challenge even the quickest PCs.

Editors’ Recommendations

  • The best graphics cards
  • Dell XPS 8930 review
  • ‘Far Cry 5’ on PC will support multi-GPU configurations, 4K at 60 fps
  • Dell Inspiron 15 7577 Gaming Review
  • Dell Inspiron 5675 gaming desktop review


6
Mar

Google Lens’ landmark, text recognition expands to all Android devices


Google Lens can turn a smartphone camera into a stand-in for the keyboard, handling tasks from reading text to recognizing objects and landmarks — but those capabilities are expanding beyond Google Pixel phones with wider Android availability and a promise of an iOS launch, too. On Monday, March 5, Google announced expanded features for Google Lens inside Google Photos on Android that allow the smart camera to save business cards as contacts and recognize landmarks. Google says the tools are also coming soon to iOS users.

Google Lens could already recognize landmarks and save business cards, but before the latest Google Photos update, those features were only available on Google Pixel phones. The latest update brings them to all Android devices.

With the update, taking a photo of a business card and heading into Lens will create a contact using the information from the card. The app uses text recognition to import the data from the image, saving users from manually typing out the information on the card.
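Google hasn’t said how Lens parses business cards under the hood, but the general recipe of optical character recognition followed by some field parsing is easy to illustrate. The sketch below swaps in the open-source Tesseract engine (via pytesseract) and simplified regex rules in place of Google’s own text recognition; the file name and parsing rules are purely hypothetical.

```python
# Illustrative only: OCR a business card photo and emit a minimal vCard.
# This uses Tesseract via pytesseract, not Google's actual Lens pipeline,
# and the parsing rules below are deliberately simplistic.

import re
import pytesseract
from PIL import Image

def card_to_vcard(image_path: str) -> str:
    text = pytesseract.image_to_string(Image.open(image_path))

    email = re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", text)
    phone = re.search(r"\+?\d[\d\s().-]{7,}\d", text)
    # Naive guess: treat the first non-empty line as the person's name.
    name = next((line.strip() for line in text.splitlines() if line.strip()), "Unknown")

    return "\n".join([
        "BEGIN:VCARD",
        "VERSION:3.0",
        f"FN:{name}",
        f"TEL:{phone.group(0) if phone else ''}",
        f"EMAIL:{email.group(0) if email else ''}",
        "END:VCARD",
    ])

print(card_to_vcard("business_card.jpg"))  # hypothetical photo of a card
```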

The second new addition for non-Pixel Android users is landmark recognition. Inside Lens, the camera will identify popular landmarks, using Google search data to provide details like a description and even hours of operation. Lens will also pull up reviews of the location and, if that’s not enough, Google Search results for the landmark.

Google’s tweet announcing the new features also confirms a Mobile World Congress announcement that the smart camera mode will also be coming to Google Photos on iOS. Google hasn’t yet shared a timeline for when the update will be available in the App Store.

First available only on Google Pixel phones, Google Lens uses a smartphone camera (or existing photographs) to give the Google Assistant “eyes.” Machine learning allows Lens to identify objects and pull up related search data for when you just can’t think of the name of that flower or another object. The feature can also scan movie posters, barcodes, and book and album covers. Lens can read text, too, letting you skip the typing by snapping a photo of a link instead.

Android users can find the feature inside Google Photos after an app update, while on Pixel phones, the feature is also inside Google Assistant.

Editors’ Recommendations

  • How to use Google Lens to identify objects on your Pixel smartphone
  • Get ready for more AR apps — Google brings ARCore to version 1.0
  • Everything you need to know about Android 8.0 Oreo
  • Everything you need to know about the Google Pixel 2 and Pixel 2 XL
  • Insert Stormtroopers into your life with Google’s new AR stickers for Pixel