MIT engineers use VR to put you in the head of a robot, like a real-life mech
Why it matters to you
This new telepresence tech lets humans carry out dangerous or unpleasant work remotely.
Researchers from the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Lab (CSAIL) have come up with a clever way to combine two of the most exciting emerging technologies — virtual reality (VR) and robotics — by creating a new telepresence control system. Using one of Rethink Robotics’ Baxter robots, a VR headset, and perhaps a page or two from mecha anime like the classic Mobile Suit Gundam Wing, they created a head-mounted “sensor display” system that puts human operators in the head of the robot they are controlling.
“A lot of jobs are difficult to do remotely, particularly in manufacturing and industry,” Jeffrey Lipton, one of the postdoctoral researchers who worked on the project, told Digital Trends. “A system like this could one day allow humans to supervise robots from a distance. This would enable employees to work from home, and could even open up manufacturing jobs to people with physical limitations, [such as those] who can’t lift heavy or bulky objects. Many industrial jobs are also terrible for human health — imagine servicing the inside of an airplane or working out on an oil rig. They can be dangerous, cramped and uncomfortable, but right now they need a human mind to understand, make decisions, and do movements. We think this model of teleoperation could allow us to keep humans safe and away from these sites while leveraging human mental capabilities.”
Jason Dorfman/MIT CSAIL
MIT’s system embeds the user in a virtual reality control room with multiple sensor displays, allowing them to see everything the robot sees at any given moment. To execute tasks, the human operator uses gestures — picked up via hand controllers — which the robot mirrors. The controls available to the operator appear virtually rather than physically, so they can change depending on what the robot needs to do at any given time.
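The core of any gesture-mirroring scheme like this is mapping the operator’s hand pose into the robot’s reachable workspace. The sketch below illustrates that general idea only; the function name, scale factor, and workspace limits are illustrative assumptions, not CSAIL’s actual system.

```python
# Toy sketch of hand-to-robot mapping for teleoperation: scale the operator's
# tracked hand position and clamp it to the robot's reachable workspace.
# All names and limits here are illustrative, not part of the real system.

def map_hand_to_robot(hand_xyz, scale=1.0,
                      workspace=((-0.5, 0.5), (-0.5, 0.5), (0.0, 1.0))):
    """Scale a hand position (meters) and clamp it to the robot's workspace box."""
    target = []
    for coord, (lo, hi) in zip(hand_xyz, workspace):
        c = coord * scale
        target.append(min(max(c, lo), hi))  # clamp each axis to its limits
    return tuple(target)

# A hand position outside the workspace is clamped to the boundary:
print(map_hand_to_robot((0.2, -0.9, 1.4)))  # -> (0.2, -0.5, 1.0)
```

In a real system the clamped target would feed an inverse-kinematics solver for the robot’s arm; the clamp simply keeps commands inside what the arm can physically reach.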
“We hope to extend this work to many different robots and scale up the trials to tasks beyond assembly,” Lipton continued, describing future research the CSAIL scientists hope to carry out. For more on the Baxter project, you can check out a research paper published earlier this year, titled “Baxter’s Homunculus: Virtual Reality Spaces for Teleoperation in Manufacturing.”
The Google Pixel 2 camera already earned the highest scores yet — here’s why
Its camera was one of the many highlights of the original Google Pixel, but with machine learning and updated hardware, Google is taking its photo game up a notch with a number of firsts in the new Google Pixel 2 and Pixel 2 XL. Announced on October 4 during an event in San Francisco, the Google Pixel 2 camera sports background blur without a second camera, dual optical and electronic image stabilization, automatic HDR and augmented reality stickers inside the native camera mode. Oh, and with a rating of 98, the Pixel 2 also earns the highest rating for any smartphone camera yet from DxOMark.
The updates stem from Google’s focus on mixing machine learning, hardware, and software across its entire line of new products. Machine learning is responsible for a number of the Pixel 2’s new features. First, the Google Pixel 2 includes a portrait mode that, like the last few iPhone Plus phones, simulates background blur that more closely resembles the photos shot from a DSLR, rather than a small sensor smartphone. The difference? Pixel 2 doesn’t need two cameras to do it.
Google / YouTube
Instead of creating a 3D depth map by comparing the views from two cameras in different locations, the Pixel 2 uses a dual-pixel sensor, comparing the left and right sides of each pixel to determine which objects are close to the camera and which are in the background. Machine learning software then applies the blur effect to the background. According to Google’s Aparna Chennapragada, the system is very accurate, even with busy backgrounds or when photographing an object instead of a person.
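The dual-pixel idea can be sketched in a few lines: a large difference between the left and right half-pixel views suggests a nearby subject, and everything else gets blurred. This is a deliberately simplified 1-D toy, not Google’s actual pipeline; the threshold and kernel size are made-up parameters.

```python
import numpy as np

# Toy dual-pixel "portrait mode": left/right half-pixel views disagree more
# for near objects, so a large per-pixel difference marks the foreground.
# Foreground pixels stay sharp; background pixels get a box blur.

def portrait_blur(left, right, image, threshold=0.1, kernel=5):
    disparity = np.abs(left - right)       # large difference ~ near the camera
    foreground = disparity > threshold     # crude depth mask
    pad = kernel // 2
    padded = np.pad(image, pad, mode="edge")
    blurred = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    return np.where(foreground, image, blurred)  # keep subject sharp

# 1-D "scanline" example: the center two pixels have high disparity (subject).
img = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
left = np.array([0.0, 0.0, 0.5, 0.5, 0.0, 0.0])
right = np.array([0.0, 0.0, 0.1, 0.1, 0.0, 0.0])
out = portrait_blur(left, right, img)
```

The subject pixels (indices 2 and 3) come back unchanged, while the rest are smoothed toward their neighborhood average.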
Besides the fact that the one-camera requirement means the feature is available without moving to the larger phablet-sized XL, the portrait mode is also available with the front-facing camera in the Pixel 2.
Computational photography is also responsible for the Pixel 2’s boost in photo quality using an automatic HDR. When a photo is shot on the Pixel 2, the camera takes several images at slightly different exposure levels, then aligns them via software on a pixel-for-pixel basis. With the data from multiple files, the resulting image contains more detail in the lightest and darkest areas of the image. Chennapragada says that all of this happens instantly, with no shutter lag between shots as the smartphone processes the files.
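The exposure-merging step described above can be illustrated with a weighted average that favors well-exposed pixels from each frame. This is a minimal sketch under assumed weighting; Google’s actual HDR+ pipeline is far more sophisticated (burst alignment, tile-level merging, tone mapping).

```python
import numpy as np

# Toy HDR merge: several aligned frames at different exposures are combined
# with per-pixel weights that favor mid-tone (well-exposed) values, so the
# result keeps detail in both shadows and highlights.

def merge_exposures(frames):
    frames = np.asarray(frames, dtype=float)      # shape: (n_frames, n_pixels)
    weights = 1.0 - np.abs(frames - 0.5) * 2.0    # 1.0 at mid-gray, 0 at extremes
    weights = np.clip(weights, 1e-3, None)        # avoid divide-by-zero
    return (frames * weights).sum(axis=0) / weights.sum(axis=0)

# Dark, normal, and bright captures of the same three-pixel scene:
dark = [0.05, 0.10, 0.40]
normal = [0.20, 0.50, 0.90]
bright = [0.60, 0.95, 1.00]
merged = merge_exposures([dark, normal, bright])
```

Each merged pixel lands between its darkest and brightest capture, pulled toward whichever frame exposed it best.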
Both hardware and computational photography are also responsible for the Pixel 2’s boost in low light stills and video. Dubbed Fused Video Stabilization, the Pixel 2 essentially uses both the optical stabilization, the hardware-type system that doesn’t create a loss in image quality, and electronic stabilization, which measures any remaining shake and crops the footage to steady the shot.
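The electronic half of that combination can be sketched simply: after optical stabilization removes most motion, any residual shake measured by the gyro is cancelled by sliding a slightly smaller crop window inside the full frame. The margin and shake values below are illustrative assumptions.

```python
# Toy electronic image stabilization: shift a crop window opposite to the
# measured residual shake, clamped so the crop never leaves the sensor frame.
# This is why EIS costs a little resolution (the margin) but steadies footage.

def stabilized_crop(frame_width, frame_height, shake_x, shake_y, margin=8):
    """Return the (left, top, right, bottom) crop window compensating the shake."""
    dx = max(-margin, min(margin, -shake_x))  # cancel shake, clamp to margin
    dy = max(-margin, min(margin, -shake_y))
    left = margin + dx
    top = margin + dy
    return (left, top, left + frame_width - 2 * margin,
            top + frame_height - 2 * margin)

# A 2-pixel shake to the right is cancelled by shifting the crop 2 pixels left:
print(stabilized_crop(640, 480, shake_x=2, shake_y=0))  # -> (6, 8, 630, 472)
```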
Google / YouTube
Besides capturing the real world, the Google Pixel 2 includes built-in augmented reality stickers. While AR stickers are growing in popularity in apps like Snapchat and Facebook Camera, the Pixel 2 is the first mainstream smartphone to our knowledge to include AR inside the native camera mode instead of a third-party app. The AR stickers, which range from emojis to characters from movies including Stranger Things and Star Wars, aren’t just placed in the scene but move around. Place two stickers, and they’ll even interact with each other.
Google says that factors like emotions, physics, and even the lighting in the scene all blend together to make the virtual characters feel like part of the scene. “Now you can be the director of all kinds of stories and share these with friends,” Chennapragada said. “…The stickers are so easy to use and they look great because the Pixel camera is specially calibrated for AR with pixel tracking and 60 fps. The camera can bridge the physical and digital with AR.”
The Pixel 2 also features a motion photos mode, a shooting style that captures three seconds of video behind the scenes.
The Pixel 2’s camera specs are rounded out with a 12.2-megapixel sensor paired with an f/1.8 lens and an updated dual-pixel autofocus system. The front-facing camera uses an eight-megapixel sensor.
All those features factor into the Pixel 2’s DxOMark score of 98, the highest score the third-party camera testing company has yet awarded to any smartphone. Last month, DxOMark updated its smartphone testing to include factors like the artificial background blur from portrait mode. While the updated guidelines suggest the Pixel 2 has one of the best smartphone cameras around, they are not directly comparable to the rest of DxOMark’s tests, so the Pixel 2’s 98 is not equivalent to the 98 earned by the full-frame Sony a7R II.
The Pixel 2’s camera will also come into play for more than just shooting photos — a preview of Google Lens, an artificially intelligent computer-vision program, is built into the Google Photos and Google Assistant native apps. The feature allows users to snap a photo of a flyer and copy and paste an email address or phone number without retyping, and to discover more about art, movies, books and music by snapping a photo.
All those features suggest users will be taking a lot of photos with the Pixel 2 — but the included Google Photos app also now gives Pixel 2 owners free, unlimited photo storage. The app also uses an automatic tagging system, making all those photos easy to search through.
Pricing for the Google Pixel 2 starts at $649, with the XL starting at $849.
Google Pixel 2 XL vs. Apple iPhone X: There can be only one
Google’s latest hopeful in the battle for smartphone supremacy is the heavyweight Pixel 2 XL. With a stunning, 6-inch AMOLED screen, slim bezels, and a solid set of internal specs, the new flag-bearer for pure Android is sure to attract many admirers. Google has signaled its intention to continue competing in the smartphone hardware business with its recent $1.1 billion acquisition of part of HTC’s smartphone team, but this follow-up to the Pixel XL has a new maker in LG.
Coming near the end of the year, the Pixel 2 XL is entering a crowded field, but no other device casts as long a shadow as Apple’s forthcoming iPhone X. It sports a 5.8-inch OLED screen — Apple’s first — which almost completely fills the front of the phone. It also boasts the lightning-fast A11 Bionic processor, a dual camera, wireless charging, and the new Face ID, which compensates for the lack of a fingerprint sensor.
If you are shopping for a new smartphone right now, and only the best will do, these two handsets deserve a place on your shortlist. But which is better? We need more hands-on time to say for sure, but for now, let’s compare the specs of the Pixel 2 XL and the iPhone X to see which smartphone comes out on top.
Specs: iPhone X | Pixel 2 XL
Size: 143.6 x 70.9 x 7.7 mm (5.65 x 2.79 x 0.30 inches) | 157.5 x 76.2 x 7.6 mm (6.20 x 3.00 x 0.30 inches)
Weight: 174 grams (6.14 ounces) | 175 grams (6.17 ounces)
Screen: 5.8-inch Super Retina OLED display | 6-inch P-OLED display
Resolution: 2,436 x 1,125 pixels (458 ppi) | 2,880 x 1,440 pixels (538 ppi)
OS: iOS 11 | Android 8.0
Storage: 64GB, 256GB | 64GB, 128GB
MicroSD card slot: No | No
NFC support: Yes (Apple Pay only) | Yes
Processor: A11 Bionic with 64-bit architecture, M11 motion co-processor | Snapdragon 835 with Adreno 540
RAM: 3GB | 4GB
Connectivity: 4G LTE, GSM, CDMA, HSPA+, 802.11a/b/g/n/ac Wi-Fi | GSM, CDMA, HSPA, EVDO, LTE, 802.11a/b/g/n/ac Wi-Fi
Camera: Dual 12MP rear, 7MP FaceTime HD front | 12.2MP rear, 8MP front
Video: Up to 4K at 60fps, 1080p at 240fps | Up to 4K at 30fps, 1080p at 120fps, 720p at 240fps
Bluetooth: Yes, version 5.0 | Yes, version 5.0
Fingerprint sensor: No, has Face ID instead | Yes
Other sensors: Accelerometer, gyroscope, proximity, compass, barometer | Accelerometer, gyroscope, proximity, compass, barometer
Water resistant: Yes, IP67 rated | Yes, IP67 rated
Battery: 2,716mAh; up to 21 hours of talk time, 13 hours of internet, 14 hours of video playback, and 60 hours of audio playback; fast charging (50 percent in 30 minutes); wireless charging (Qi standard) | 3,520mAh; fast charging
Charging port: Lightning | USB-C
Marketplace: Apple App Store | Google Play Store
Colors: Space Gray, Silver | Black, Black and White
Availability: AT&T, Verizon, T-Mobile, Apple | TBA
Price: $999 | $849
DT review: Hands-on review | Coming soon
These are two of the fastest and most powerful phones you can find, but there is a significant gap on the performance front. Apple’s custom-designed A11 Bionic chip packs six cores — two for high performance and four for high efficiency — as well as a graphics processing unit that is 30 percent faster than the one in last year’s iPhone 7. That enormous processing power is put to good use with cutting-edge augmented reality support and the new Face ID technology, which lets you unlock your iPhone simply by holding it up to your face.
By contrast, the Pixel 2 XL has Qualcomm’s octa-core Snapdragon 835 processor paired with an Adreno 540 GPU. We know this combo well because it has featured in a range of top Android smartphones over the last few months, from Samsung’s Galaxy S8 to the HTC U11. Google’s new phone is likely to squeeze the best performance from this hardware because it runs stock Android with no embellishments, but it’s not going to keep up with the iPhone X. Last year’s A10 chip, which Apple employed in the iPhone 7 and 7 Plus, beat the Snapdragon 835 in most benchmarks, and the new A11 is a lot faster.
The fact that the Pixel 2 XL has 4GB of RAM to the iPhone X’s 3GB is not going to change things for Google’s phone, because Android and iOS handle memory management in very different ways, and the iPhone simply doesn’t need as much RAM for comparable performance.
Both phones offer 64GB in the basic model, or you can pay more to jump up to 256GB with the iPhone X, or 128GB with the Pixel 2 XL. Neither offers a MicroSD card slot for expansion.
We don’t think you’ll really feel the difference in everyday use now — both these phones are plenty fast and powerful enough — but there’s no doubt that the iPhone X has more raw power and that will likely tell over time as these handsets age.
Winner: iPhone X
Display, design, and durability
Google’s Pixel 2 XL is a big phone by any measure, but the manufacturer, LG, had some practice shrinking bezels and fitting gorgeous OLED screens, most recently in the LG V30. The two-tone design on the aluminum back, which we saw in the original Pixel phones, has been retained in the Pixel 2 XL, but this time the signature glass portion at the top has shrunk in size and the fingerprint sensor sits below it. The main camera is at the top left corner, flanked by the flash. Flipping it back over, there is a power button on the right spine, highlighted in orange, with a volume rocker slightly further down.
The 6-inch display dominates the front of the phone, and it is one of the sharpest displays we have ever seen, with a resolution of 2,880 x 1,440 pixels (538 pixels per inch). Thin bezels top and bottom house the front-facing stereo speakers, microphone, front-facing camera, and a few other sensors. The Pixel 2 XL is significantly taller and a touch wider than the iPhone X, but they weigh about the same.
The iPhone X marks a real change of direction for Apple by embracing the bezel-less trend and adopting OLED screen technology. While the notch that eats into the top of the display has garnered some negative reactions, most people agree that the iPhone X looks stunning and futuristic. A svelte, steel frame gives way to a glass back, which allows the iPhone X to support wireless charging. The dual camera module is stacked vertically, unlike the 8 Plus. Overall, it’s a beautiful device that reflects Apple’s minimalist approach to design.
The notch is a necessary evil, as it houses the front-facing camera, Face ID hardware, speaker, microphone, and some other sensors. There is no room for a fingerprint sensor, but the almost all-screen front allows Apple to pack a 5.8-inch screen into a relatively compact body. The OLED screen is the sharpest Apple has ever put in a smartphone, with a resolution of 2,436 x 1,125 pixels (458 pixels per inch).
On the durability front, Apple’s iPhone X and Google’s Pixel 2 XL boast an IP67 rating, which means they can be dunked at a depth of up to one meter for 30 minutes without sustaining any damage. Neither of these phones has a 3.5mm audio jack, so you’re stuck with adapters or Bluetooth 5.
While the Pixel 2 XL has a slightly bigger and sharper display, the iPhone X is smaller and more stylish. The design of the iPhone X is a real leap for Apple, while the Pixel 2 XL feels like an evolutionary step for Google. The iPhone X edges out the win in this one.
Winner: iPhone X
Battery life and charging
The Pixel 2 XL is packing a sizable battery rated at 3,520 mAh. That is a touch bigger than last year’s Pixel XL, which we found fairly average in the battery department. We expect the Pixel 2 XL to last at least a full day between charges, maybe longer. The iPhone X has a much smaller, 2,716 mAh battery. It has a slightly smaller screen to power and there may be efficiency benefits from that A11 chip, but we fully expect it to be in the daily charging category. We’ll have to test them out in the wild to be sure, but you’d expect to get more battery life from the Pixel 2 XL based on the specs.
As for charging, the iPhone X can charge 50 percent in 30 minutes. The Pixel 2 XL supports fast charging, too, with a 15-minute charge providing up to 7 hours of battery life.
The iPhone X also supports wireless charging via the Qi standard, which is very handy if you have wireless phone chargers. Sadly, the Pixel 2 XL doesn’t support wireless charging, but it still wins this category for the larger battery capacity.
Winner: Pixel 2 XL
Cameras
Google continues to ignore the dual camera trend with the Pixel 2 XL, which has a single camera sensor rated at 12.2 megapixels with an f/1.8 aperture. The original Pixel proved that a well-designed single lens camera with clever software can go toe-to-toe with any dual-lens setup, and the hardware in the Pixel 2 XL has been improved. DxOMark has scored it at 98, the highest score of any smartphone camera so far. For comparison, the second place phones right now are the iPhone 8 Plus and Galaxy Note 8 on 94. Google added optical image stabilization and made some software tweaks to improve the overall experience. The Pixel 2 XL also offers a portrait photo mode, on both the front-facing and rear cameras, like the iPhone X, though it works differently. The fact that Google provides free, unlimited cloud storage for all your photos and videos in high resolution is a valuable extra.
Apple’s iPhone X camera sports dual 12-megapixel sensors, combining wide-angle and telephoto lenses, with f/1.8 and f/2.4 apertures, respectively. Both have optical image stabilization, to help prevent blurry shots. You can take unique portrait shots and tweak the light source with the Portrait Lighting feature. There is also a powerful bokeh effect, to blur backgrounds, and 2x optical zoom. The front-facing camera boasts Apple’s True Depth system, employing an infrared camera that works together with the 7-megapixel front-facing camera to securely recognize your face and unlock your phone. It can also be used to animate emojis with your own facial expressions, something Apple calls Animoji.
On the video front, Apple’s camera can shoot 4K at 60 frames per second and 1080p at 240 fps. By contrast, the Pixel 2 XL tops out at 30 fps for 4K and 120 fps for 1080p footage.
We want some hands-on time with these cameras before we declare an overall winner.
Winner: Tie
Software
Both Android and iOS are polished, feature-packed platforms that have much more in common than divides them. You can get all the major apps and games on both. Familiarity is probably going to dictate your preference. Google’s Pixel 2 XL runs the latest Android 8.0 Oreo out of the box and will get updates just as soon as they come out, which is more than you can say for a lot of other Android flagships. This is Google’s vision of how Android should be, and it’s every bit as slick and stylish as Apple’s iOS. The iPhone X also runs the latest version of Apple’s platform, iOS 11, and it will get updates as soon as they’re released for the foreseeable future.
The iPhone X has a handful of nifty features connected to the front-facing cameras, which we have already discussed. But the Pixel 2 XL offers Active Edge, which allows you to squeeze your phone to launch Google Assistant or snap a selfie, like HTC’s Edge Sense on the U11. Both Google and Apple are highlighting augmented reality, with a range of fun games that play out on top of your coffee table, and handy apps that can help you preview furniture in your living room. The best ARKit apps are well worth a look for iPhone owners, but Google is doing something similar with ARCore. Google also has Daydream VR, whereas Apple has yet to jump on board with virtual reality.
Both platforms are great. We can’t award a winner in this category.
Winner: Tie
Pricing and availability
Neither of these phones is available yet. The iPhone X is available for pre-order from October 27 and will begin shipping on November 3. The Pixel 2 XL is available for pre-order now and should start shipping on October 17.
The iPhone X starts at a whopping $1,000 for the 64GB model, while the 256GB variety will cost you an extra $150. It will also be available on contract from all the major carriers, including Verizon, AT&T, T-Mobile, and Sprint, and you can expect to pay close to $50 per month if you opt for the 64GB model on a 24-month contract.
The Pixel 2 XL starts at $850 and Google is offering a Home Mini speaker, which usually costs $50, for free with every purchase for a limited time. We’re not sure about carriers just yet.
Winner: Pixel 2 XL
Overall winner: Apple iPhone X
If the iPhone X price tag gives you pause, or you prefer Android, then the Pixel 2 XL may prove irresistible. It has a gorgeous, big display, an excellent camera, and some slick software extras. Google’s latest and greatest looks fantastic and we can’t wait to spend more time with it, but Apple’s iPhone X edges the overall win. It’s more powerful, it has a more daring design, and we’re excited about the dual cameras on the back and Face ID on the front. These may be the two best smartphones to launch this year, so you can’t really go wrong. It’s a very close-run thing, but the iPhone X is our winner.
Google taking ARCore to next level with AR Stickers, more on Pixel 2, Pixel 2 XL
Why it matters to you
Google’s ARCore platform needs nothing more than a few sensors and a single camera.
Google launched ARCore, an augmented reality development platform for Android, in August. It’s available in preview on the Google Pixel, Pixel XL, and Samsung Galaxy S8 ahead of a launch on phones from LG, Asus, and Huawei. But that’s just the start. At an October 4 event in San Francisco, Google announced a slew of ARCore updates pegged for the holiday season.
One of the highlights is AR Stickers, a Google-designed app launching in preview alongside the Pixel 2 and Pixel 2 XL. Using ARCore’s environment-scanning algorithms, the app overlays digital decorations on surfaces like floors, tables, walls, and chairs. Google demoed a few inspired by the hit Netflix series Stranger Things at the event in San Francisco, and said it’s teaming up with brands like Disney, the NBA, NBC, Netflix, and others for the first few sticker packs.
The other new ARCore apps are less exclusive — Google says they’ll work on “most” Android devices running 7.0 Nougat or newer. A new League of Legends app lets spectators watch matches unfold in their living rooms. An updated Lego app puts hundreds of digital kits and pieces at your fingertips. (Google showed a Lego astronaut zooming around in a spaceship.) And a shopping app from Houzz lets you pick, personalize, customize, and “place” furniture before you order it.
The Pixel 2 and Pixel 2 XL are uniquely optimized for ARCore, Clay Bavor, Google’s vice president of AR and VR, said. Thanks to high-fidelity gyroscopes and accelerometers, top-end cameras, and powerful image processors, they’re able to run most ARCore apps at a consistent 60 frames per second — a smoother image than you’re likely to get on less powerful devices.
Google said that ARCore focuses on three critical elements — motion tracking, environmental understanding, and lighting. It determines a device’s position and orientation in space by anchoring onto specific “landmarks” — furniture, for example — in a room, and adjusts for factors like ambient lighting. It’s not unlike Project Tango, the Google-designed depth-sensing platform built into LG’s Phab 2 Pro and Asus’s ZenFone AR. But Google sees the two technologies as complementary.
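The anchoring idea described above can be sketched very simply: a virtual object is pinned to a world-space landmark, and as the device’s estimated pose changes, the object’s position relative to the camera is recomputed so it appears fixed in the room. This is a translation-only 2-D toy to show the concept, not the ARCore API.

```python
# Toy AR anchoring: an anchor lives in world coordinates; each frame, it is
# re-expressed in camera coordinates using the device's estimated pose, so
# the virtual object stays put in the room even as the device moves.
# (Real systems use full 6-DoF poses; this sketch is translation-only.)

def world_to_camera(anchor_world, camera_pose):
    """Express a world-space anchor point in camera coordinates."""
    cx, cy = camera_pose
    ax, ay = anchor_world
    return (ax - cx, ay - cy)

anchor = (3.0, 2.0)  # landmark on a table, in world coordinates
print(world_to_camera(anchor, (0.0, 0.0)))  # -> (3.0, 2.0)
# Device moves; the anchor's camera-space position shifts accordingly:
print(world_to_camera(anchor, (1.0, 0.5)))  # -> (2.0, 1.5)
```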
“We think of Tango more and more as an enabling technology — it’s akin to GPS, where you don’t see devices or apps branded as having this technical capability,” Google told us. “[The public] may not see consumer-branded Tango devices moving forward.”
The new ARCore-enabled apps will be available from Google’s AR Experiments showcase and the Google Play Store when they launch later this year.
Google Lens ships with the Pixel 2 — here is what’s so cool about the program
Why it matters to you
Google Lens is still learning, but a few features will be available first to owners of Google’s newest smartphones.
Unveiled earlier in 2017 at Google I/O, the first public version of the artificially intelligent computer vision program Google Lens is now part of the new Google Pixel 2. During Wednesday’s October 4 event in San Francisco, Google shared a preview of Lens that will ship inside the new Pixel 2 smartphone, with integration into both Google Photos and Google Assistant.
Google Lens is the tech giant’s computer vision software that collects information from a photograph to either save some time by skipping the typing or to learn something new about the things that we see around us. The tool effectively mixes Google search with a camera, and while the Pixel 2 only contains a preview of the feature, the platform already creates a few promising shortcuts.
During the event, Google’s Aparna Chennapragada shared how the new feature allows the smartphone’s camera to be used as a sort of keyboard. When taking a photo of something with text, like a flyer, Google Lens allows users to highlight text such as email addresses, phone numbers, websites, and street addresses and copy the information. The shortcut makes it easy to look up a location on Google Maps or call a phone number without typing it into the keyboard.
Besides serving as a visual shortcut to typing in long and unusual email addresses, Google Lens is also designed to help users understand the objects they see — starting with art and entertainment. Snapping a photo of a piece of art will tell you who the artist is and what else they painted. See a movie poster? Lens will tell you whether the flick is worth watching. Snapping photos of album and book covers likewise leads you to more details on the work.
The preview inside the Pixel 2 is just a start for the computer vision software. When the software was first announced, Google listed a long list of possibilities, including translating text, getting more details on a business, reading Wi-Fi network settings, and learning the name of that flower you just spotted.
Google’s computer vision also works with existing photos, powering a number of tools inside the native Google Photos app on the Pixel 2. Searching for specific objects, people and even famous landmarks is possible through the program’s auto-tagging feature.
Google Lens is based on machine learning — Google essentially used the millions of photos in its search results to train the computer to recognize what a specific object looks like. With enough photos, the program can learn what the Eiffel Tower looks like on a cloudy day, lit up at night, or even blurred from camera shake, and still correctly identify what is in the photo.
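The robustness-through-variety idea can be shown with a toy: train on many varied examples of each object and even a degraded new example still lands nearest the right class. Everything here is made up for illustration — the three-number “image” features and the nearest-centroid rule stand in for a real neural network.

```python
import numpy as np

# Toy recognition sketch: many noisy "photos" of each landmark (varied
# lighting, blur) are averaged into a class centroid; a new image is labeled
# by its nearest centroid. Real systems learn far richer features.

rng = np.random.default_rng(0)

# 50 varied examples per landmark, as made-up 3-number feature vectors:
tower = [np.array([1.0, 0.2, 0.1]) + rng.normal(0, 0.05, 3) for _ in range(50)]
bridge = [np.array([0.1, 0.9, 0.3]) + rng.normal(0, 0.05, 3) for _ in range(50)]
centroids = {"tower": np.mean(tower, axis=0), "bridge": np.mean(bridge, axis=0)}

def classify(image):
    """Label an image by its nearest class centroid."""
    return min(centroids, key=lambda name: np.linalg.norm(image - centroids[name]))

# A "degraded" photo of the tower (shifted features) is still recognized:
print(classify(np.array([0.8, 0.3, 0.2])))  # -> tower
```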
Chennapragada said that Google Lens will continue to improve with use. For example, she said, Google’s voice recognition, at first, wouldn’t always recognize speech correctly, particularly with factors like accents. Now, after several years of development, Google voice has a 95 percent accuracy rate.
Google CEO Sundar Pichai said that the object recognition AI built by Google initially had a 39 percent accuracy rate. Using what’s called AutoML (essentially artificial intelligence building more AI programs), that accuracy rate has improved to 43 percent and is continuing to climb.
“This is why we are excited about the shift from mobile first to AI first, it’s radically rethinking how computers work,” Pichai said during the presentation. “Computers should adapt to how people live their life, rather than people adapting to computers.”
Google Lens will first be available in Pixel 2 by tapping the Lens icon inside both Google Photos and Google Assistant.
Creepy ‘parasitic’ machine grabs your arm, forces you to generate power for it
Why it matters to you
Conceptual art project challenges our assumptions that machines are always here for our benefit.
In the past, we’ve written about smart devices and fabrics which harness some aspect of the user’s body in order to extract energy. A new project created by researchers at the Human Computer Interaction Lab at Germany’s Hasso Plattner Institute (HPI) builds on that idea — through the creation of a “parasitic” machine that requires human-generated energy in order to function, but which gives absolutely nothing back to the human user in return. Picture the robot equivalent of a mosquito, and you won’t be a million miles off.
“This is an art piece made by five researchers in human computer interaction: [myself], Alexandra Ion, David Lindbauer, Robert Kovacs and Professor Patrick Baudisch,” Pedro Lopes, one of the creators, told Digital Trends. “The purpose is to stimulate the viewer or visitor. The project itself is a reversal of our own work in HCI, where all our prototype technologies, much like in our everyday life, include a human [coming out] on top of machines. We just wanted to let visitors try out how it feels if a ‘machine is on top.’”
The device is housed in a rectangular acrylic tube, which features a crank mechanism and seat at each end. The crank-style levers invite you to grab them but, when you do, a pair of electrode cuffs lock your arm in place, and then stimulate your wrist muscles with small electric shocks to make you involuntarily crank the lever. Doing this generates kinetic energy, which keeps the machine going. The process is only ended (well, for you at least) when another person comes along and cranks the opposite lever — thereby taking over as the host. The machine’s name? “Ad Infinitum.”
Lopes says that the project grew out of five years of serious work on electrical muscle stimulation and what exactly it could be used to achieve. When it comes to “Ad Infinitum,” he makes clear that the work is “100 percent” conceptual, although it certainly provokes plenty of interesting questions about the relationship between humans and the technology they use — and the symbiotic, occasionally parasitic, link between the two. (You could argue that similar power dynamics are at play in the use of data, and in how we willingly make smarter the machines that may one day replace us.)
So far, “Ad Infinitum” has been shown off at events including the Science Gallery Dublin’s “Humans Need Not Apply” exhibition in Ireland, and “The Practice of Art and Science” exhibition at Ars Electronica in Linz, Austria.
With the Pixel 2, Google is doubling down on the single camera
The Google Pixel 2 has one camera. And that’s actually a really big deal.
There are a lot of interesting aspects to the Google Pixel 2 and Pixel 2 XL, but one of the most interesting is also the least controversial: Google is sticking with a single rear camera.

Every major phone manufacturer, from Apple to Samsung to LG and Huawei, has transitioned over the past few years to flagships with dual camera setups. While the implementation varies between handsets — a second monochrome sensor, a wide-angle lens, or telephoto/portrait abilities — the strategy is the same: augment the primary shooter with additional functionality in an attempt to stand out from a quickly maturing field of competitors, and in turn sell more phones.
Google wants people to take great photos every time, and it’s using its expertise in software to make sure that happens.
With the Pixel 2 and Pixel 2 XL, Google is doing the exact opposite. It is doubling down on the single camera, and investing heavily in software-based solutions to augment the 12MP sensor’s natural abilities. Sure, both the Pixel 2 and Pixel 2 XL benefit from new physical hardware, in this case the addition of optical image stabilization as well as a wider, faster f/1.8 lens, but any portrait effects, digital zoom, noise reduction, or tightly-stitched panoramas are all done using Google’s increasingly powerful, and incredibly impressive, suite of software tools marketed under the banner of “computational photography.”
As Google showed with the first-generation Pixel’s HDR+ mode, computational photography has real-world advantages. Sure, most manufacturers, from Apple to Samsung, engage software to influence the output of photos to some extent, but Google’s strategy is to completely mitigate the disadvantages of only one sensor — indeed, to pour all of its resources into that one digital pathway — through lines of code. And while HDR+ has existed in the Nexus line as far back as 2013’s Nexus 5, it wasn’t until 2016 with the Pixels that the hardware speed caught up to the software’s ambition. Back in 2014, with the Nexus 6, Phil Nickinson wrote this about the oversized phone’s camera:
Again, I’ve gotten some really good low-light shots. And I’ve gotten some really bad ones.
Google’s HDR+ mode helps with that some, bringing a little better balance. But it also exposes our chief gripe with the camera app. It’s just slow. It takes more than a few beats to launch from a cold start, and even worse if you don’t manage to actually launch it on the first try from the lock screen shortcut. And you can take an HDR+ shot then have to wait a good 5 or 10 seconds for it to finish processing before you can tell if you need to take another one.
That frustrating wait time became less significant in 2015 with the Nexus 6P, and considerably more tolerable with the Pixels. Today, when you aim and shoot with a Pixel or Pixel XL, it’s safe to leave HDR+ on all the time, since processing is practically instantaneous. And the processing abilities are profoundly better; HDR+ smooths the skin tones of a portrait, captures the vibrancy of a sunny day, properly exposes a delicate sunset, and brings out detail in low light.
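Google hasn’t published HDR+’s internals in full, but the publicly described core idea is merging a burst of frames so that random sensor noise averages away before a tone curve lifts the shadows. A minimal sketch of that burst-merge idea in Python, assuming the frames are already aligned (the Reinhard-style tone curve is my own stand-in, not Google’s):

```python
import numpy as np

def merge_burst(frames):
    """Average a burst of aligned frames.

    Averaging N frames reduces random sensor noise by roughly a
    factor of sqrt(N), which is the core idea behind burst-based
    HDR pipelines.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

def tone_map(image, gain=4.0):
    """Simple global tone map: boost shadows, compress highlights."""
    boosted = image * gain
    return boosted / (1.0 + boosted)  # Reinhard-style curve, output in [0, 1)

# Demo: four noisy captures of the same dim (0.2-intensity) scene.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 0.2)
burst = [scene + rng.normal(0.0, 0.05, scene.shape) for _ in range(4)]
merged = merge_burst(burst)
result = tone_map(merged)
```

The merged frame sits measurably closer to the true scene than any single capture, which is exactly the low-light win described above.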
Part of the reason the Pixel does so much better at this stuff than the Nexus 6P before it is thanks to Google’s stringent oversight of the hardware and software; though the company does not build its own phones, it spent a lot of time working with its hardware partners HTC and Sony to perfectly tune the camera for its software.

Now that the Pixel 2 is here, you’ll be able to use a Portrait Mode that blurs out the background. But, as HDR+ proved with low-light photos, it only takes a single lens to do it.
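Google’s real single-lens Portrait Mode leans on machine-learned segmentation and dual-pixel depth data. As a toy illustration of just the compositing step, here’s a sketch that keeps masked subject pixels sharp and blurs everything else; the foreground mask is assumed to come from a segmentation model, and the naive box blur stands in for a proper lens-like bokeh:

```python
import numpy as np

def box_blur(image, radius=1):
    """Naive box blur via shifted-and-averaged copies (edges clamped)."""
    h, w = image.shape
    padded = np.pad(image, radius, mode="edge")
    out = np.zeros((h, w), dtype=np.float64)
    n = (2 * radius + 1) ** 2
    for dy in range(2 * radius + 1):
        for dx in range(2 * radius + 1):
            out += padded[dy:dy + h, dx:dx + w]
    return out / n

def portrait_composite(image, foreground_mask, radius=2):
    """Keep masked (subject) pixels sharp; blur the background."""
    blurred = box_blur(image, radius)
    return np.where(foreground_mask, image, blurred)
```

In a real pipeline the blur radius would also vary with estimated depth, so distant background melts away more than objects just behind the subject.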
Stranger Things and ‘FoodMoji’ live in the Pixel 2 camera thanks to AR Stickers
Watching Eleven take out the Demogorgon never gets old.

With a couple of taps, the mighty Demogorgon can be summoned through your Pixel 2 to make a scary short film. When you’re bored with this monster, all you have to do is call upon everyone’s favorite frozen waffle-eating monster slayer to send it back to the Upside Down.
Sound like fun? This is the tip of a particularly impressive feature Google has built straight into the Pixel 2 camera app. It’s called AR Stickers, and the whole goal is to make it ridiculously easy to have fun with augmented reality.

AR Stickers is a lot more than a licensing deal with Netflix so you can take photos with a Demogorgon, although it is also that. And you can expect other big names to follow, like Star Wars, the NBA, and Saturday Night Live. It’s an interactive special effects lab that really shows off just how deep into the AR game Google is right now. These “stickers” are entirely interactive creatures, people, and props. Some will turn to look at you when you walk around them, and can even turn to face other humans in the camera shot, interacting as though they’re really there. Of course, they also interact with each other in all kinds of fun ways.
Google’s plans for AR Stickers include a constantly updating array of options. You’ll see seasonal offerings when appropriate, pop culture tie-ins for things like Stranger Things, and some homegrown content made by the folks at Google. One of these is called “FoodMoji,” and it’s not particularly difficult to guess what happens here. You’ll be presented with options like a sleepy cup of coffee or an excited cheeseburger to interact with, enhancing your photos in new and cute ways.
The most impressive part of all of this is how well these AR Stickers work. Instead of a separate app with a separate plugin for the camera, Google baked this right in. That means you’re already using the best possible camera app for your phone to make these AR creations come to life, and no additional installation or setup is required. This is seamless AR, and it’s clear Google thinks a lot of people are going to be very excited about playing with these creations.
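The “turns to look at you” behavior described above boils down to a look-at rotation: given the sticker’s position and the camera’s position in world space, compute the yaw that points the sticker’s forward axis at the viewer. A minimal sketch of that one step (the coordinate conventions here are my own assumption, not Google’s):

```python
import math

def yaw_to_face(sticker_pos, viewer_pos):
    """Yaw angle (radians) about the vertical axis that turns a sticker
    so its forward direction (+z by this sketch's convention) points at
    the viewer.

    Positions are (x, y, z) tuples in a y-up world frame; only the
    horizontal offset matters for yaw.
    """
    dx = viewer_pos[0] - sticker_pos[0]
    dz = viewer_pos[2] - sticker_pos[2]
    return math.atan2(dx, dz)
```

A viewer standing directly in front (along +z) yields a yaw of zero, while a viewer off to the side (+x) yields a quarter turn; re-evaluating this every frame is what keeps a sticker tracking you as you walk around it.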
For now, AR Stickers are going to be exclusive to the Pixel 2 and Pixel 2 XL. The feature also won’t be available at launch, but will be a part of an update in the not-too-distant future. This app is part of Google’s careful calibration of these phones for augmented reality, and so it’s going to especially shine on these phones. There are no current plans to make this feature available elsewhere, but as the idea grows in popularity it’s possible we’ll see these AR Stickers show up everywhere.
Everything new about the updated Daydream View
A dozen little changes make a big difference in the updated Daydream View headset.

It may only look like a subtle color change from the outside, but Google’s VR team has put a ton of effort into improving the Daydream View headset. This new version was designed to really take advantage of the Pixel 2 XL but, like all things Daydream, you can use any phone certified to work with Google’s VR platform.
I spent a few minutes with the new Daydream View headset, and here’s a quick peek at everything Google has improved.
Goodbye light leak

My biggest complaint about the original Daydream View is how much light comes in from the sides of the headset. If you have a bright light behind you, the VR experience is quickly compromised, pulling you out of immersion.
This has been largely fixed by the new padding design, and surprisingly the headset is still very glasses-friendly. The padding comes much closer to the sides of the head, but not close enough to squeeze against your face. This means there could still be some light leak depending on how your face fits in the mask, but it’s going to be dramatically reduced if you notice it at all.
Hello heat sink

The pad where you place your phone before closing it into the headset is now much grippier and made of less rigid plastic. This is because there are now magnesium heat pipes running from the top of this hinged section to the bottom. This heat sink even works if your phone is in a case, though obviously not quite as well as if the phone were directly touching the surface.
According to Google, this heat sink allows the Pixel 2 XL to run complex Daydream apps with no heat issues whatsoever, and will stop most Daydream-enabled phones from overheating even after hours of gameplay.
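As a back-of-the-envelope illustration of why that extra conductance matters, a lumped thermal model puts a device’s steady-state temperature at ambient plus dissipated power divided by its total thermal conductance to ambient. The numbers below are made up for illustration and are not Google’s specs:

```python
def steady_state_temp(ambient_c, power_w, conductance_w_per_k):
    """Steady-state temperature (deg C) of a lumped body dissipating
    power_w through a total thermal conductance to ambient."""
    return ambient_c + power_w / conductance_w_per_k

# Illustrative (made-up) numbers: a 3 W sustained VR load in a 25 C room.
without_sink = steady_state_temp(25.0, 3.0, 0.12)  # phone sealed in a headset
with_sink = steady_state_temp(25.0, 3.0, 0.30)     # heat pipes added
```

Raising the conductance drops the equilibrium temperature for the same workload, which is exactly the effect the magnesium heat pipes are after; a cooler SoC also avoids thermal throttling during long sessions.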
New controller storage
The flat surface created by the heat sink means there’s no elastic carrier for your Daydream Controller, so Google moved it elsewhere. You can now find an elastic tether at the back straps, which will hold the controller in place when you’re not using it.
While this does expose the controller in a way that was previously impossible, the placement makes getting set up to play something in Daydream a little easier.
Improved head strap
Not only is the controller on the back of the head strap now, but the design has been adjusted to allow for a top strap for added security. This helps distribute the weight of the headset, but if you decide the top strap isn’t your favorite you can easily remove it and keep your hair from being messed with.
Better fabrics and padding
The original Daydream View was mostly fabric, and Google has doubled down on that approach with this new design. The outer fabrics feel a little sturdier, so there’s less concern about tearing than with the predecessor, and the inner materials are noticeably softer and more plush.
Perhaps most important, Google’s new fabric choices make the headset noticeably more comfortable to wear.
Huge custom lenses

The days of circular lenses just like the ones you would see in a Google Cardboard headset have come to an end. Google went all in and created custom hybrid Fresnel lenses for the Daydream View. This means the “sweet spot” where everything looks crystal clear is larger, and the field of view has been noticeably improved.
These lenses have an irregular shape and you may notice some quirky rings in the design, but as soon as you put the headset on all of that melts away to a much improved viewing experience.
New Colors

Crimson and Snow are out, sadly. Google’s new outer fabrics called for an updated color palette, and Google went with dark grey, slightly less grey, and somewhat faded orange. Sorry, I got confused. I meant to say Charcoal, Fog, and Coral.
Even more exciting than new colors is that this year the controllers will match! This means you get a bright happy Coral Daydream Controller to go with your headset, making it much easier to find should you misplace it.
Coming soon to a Pixel 2 near you
Google plans to make the new Daydream View available for $99 alongside Pixel 2 preorders starting today, and it will ship on October 19.



