

5 Oct

Google Clips camera lays the groundwork for our AI-powered future


Allow me to make a bold prediction: Google’s Clips camera is going to flop.

Clips is a $250 camera powered by artificial intelligence and designed to snap images of important moments as they happen, no human input required. At best, it’ll probably sell OK at launch — there will be a handful of cute videos showing how the camera performs while attached to a dog or the top of a baby’s toy mobile, and the internet will briefly swoon. Maybe a few months later, it’ll catch a crime in action, and we’ll be reminded that these odd, all-observant cubes exist.

But regardless of the viral content that comes out of Clips, it won’t be enough to convince mainstream consumers to run out and drop $250 on a clip-on camera. Smartphones have cameras (really good ones, even), and a lot of people have smartphones. Clips might address a real problem — freeing up users to experience life without worrying about filming it — but no one needs this technology right now. Besides, it’s kind of a creepy concept overall.

Allow me to make another, less bold claim: Google knows all of this. And while it would be great for the company’s bottom line (and its data-collection department) if Clips takes off, it doesn’t need the hardware to sell well right now. Google most likely has larger plans for Clips’ software.

Clips is a truly intriguing concept. As much as it evokes dystopian panic in the minds of anyone who’s read 1984, the camera actually solves a ubiquitous issue with current capture technology. Every time we document something with our smartphones, we construct a barrier between ourselves and the actual experience, fundamentally changing the way we remember that moment. Cameras alter our perception of reality. Ideally, Clips removes the need to manually take photos or videos, allowing people to exist in the moment, secure in the knowledge they’ll be able to upload the best bits to Instagram later.

This is the future of social media. Not the Clips camera itself, but the software inside of it. As wearables and smart-home devices become more commonplace, smartphones become less visible, and Google is preparing for a world where Glass actually looks good and sells well. Snapchat Spectacles proved people are into the idea of a camera attached to their glasses — but now imagine if that camera automatically captured the coolest moments of your day and lined them up to be shared across social media platforms whenever you wanted. It sounds like something out of Black Mirror, and it’s exactly what Google is building up to with Clips.

Google is banking big on AI across all of its products, and Clips is an ideal test of the public’s comfort with computer vision in their everyday lives. This isn’t the first clip-on camera designed to broadcast your day to the world — here’s looking at you, Narrative Clip — but it is the first to incorporate machine learning into the process. Combined with future wearables, this is a massive evolution of the lifelogging platform.

Ahead of today’s big Pixel 2 event, Google CEO Sundar Pichai told The Verge that his interest in Clips was not the camera itself, but its AI.

“I made a deliberate decision to name the hardware product with [a] software name,” Pichai said. “The reason we named it Clips is that the more exciting part of it is … the machine learning, the computer vision work we do underneath the scenes.”

Using machine learning, Google says the Clips software will improve over time as it recognizes what users consider to be interesting or important moments in their everyday lives. It makes sense for the company to test out this type of technology now, via a standalone, throwaway device. Surely, Clips will sell well enough to generate a bunch of data for Google, but the camera itself doesn’t give away the company’s plans for the future. Google can pivot, streamline and perfect its Clips AI before integrating it — as software, rather than hardware — into reliable wearables and in-home devices. It already has a software name, after all.
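To make that concrete, the core loop is easy to picture: an on-device model scores incoming frames for “interestingness,” only the best moments are kept, and user feedback (which clips you keep or delete) refines the model over time. Here is a minimal sketch of that idea — the model, scores, and threshold are hypothetical stand-ins, not Google’s actual implementation:

```python
# Hypothetical sketch of Clips-style moment selection: score frames with
# an on-device vision model and keep only the highest-scoring ones.
# The model, scores, and threshold are illustrative, not Google's.
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    timestamp: float
    pixels: bytes  # raw image data

def interestingness(frame: Frame) -> float:
    """Stand-in for the on-device vision model (faces, pets, framing, motion)."""
    return 0.0  # a real model would return a learned score per frame

def select_moments(frames: List[Frame], threshold: float = 0.8) -> List[Frame]:
    """Keep only the frames the model considers worth saving."""
    return [f for f in frames if interestingness(f) >= threshold]
```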

As for the creepiness factor, here’s where the 1984 paranoiacs might have it right. As demonstrated by the invasion of IoT devices in our homes and tracking apps on our phones, we’ll get used to the idea of cameras constantly ready to document our lives. Hell — we’ll most likely love it. Google just needs to convince us to like it first.


5 Oct

Pixel Buds hands-on: A better way to wear Google Assistant


Pixel Buds isn’t just the name of my Android-centric group chat anymore — it’s Google’s first attempt at a wireless headset, and a pretty smart one at that. It’s not hard to look at the Buds as the answer to Apple’s surprisingly popular AirPods, but they’re much more capable… as long as you don’t mind a cord dangling behind your head.

I’m told pairing with a Pixel is dead-simple — just open the charging case, power them up, and you’ll get a notification on the phone prompting a connection. (The units we tried were already connected, and Google didn’t love the idea of me redoing all that for myself.) Getting them in your ears is easy too, but nailing the fit takes a little extra work. Remember that cord I mentioned? You can pull it out near the buds to create a little loop meant to nestle into the crevices of your ears. Given that fully wireless earbuds have a tendency to disappear if they fall out of someone’s ears, I appreciate that Google thoughtfully used that cord to make sure the Buds don’t go anywhere they’re not supposed to.

The idea of a tiny touchpad sitting in your ear might seem off-putting, but operating the Pixel Buds is a breeze. Quick swipes forward and back control volume, a tap pauses whatever you’re listening to and… well, that’s mostly it. It would’ve been nice to have, say, a double-tap that changes the currently playing track, but that gesture is currently used for reading notifications when they land on your phone.
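For the curious, the control scheme amounts to a small gesture-to-action table. Here’s a toy sketch of how that mapping might look in code — the names are invented for illustration, not Google’s API:

```python
# Toy model of the Pixel Buds touch controls described above.
# Gesture and action names are invented for illustration.
CONTROLS = {
    "swipe_forward": "volume_up",
    "swipe_back": "volume_down",
    "tap": "play_pause",
    "double_tap": "read_notifications",  # why double-tap can't skip tracks
}

def handle_gesture(gesture: str) -> str:
    """Dispatch a touchpad gesture to its assigned action."""
    return CONTROLS.get(gesture, "ignored")

print(handle_gesture("tap"))  # -> "play_pause"
```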

Here’s the best part: to activate Assistant, you just long-press the right bud and lift your finger when you’re done. Not only does that better isolate your command from the noise around you, it also removes the social awkwardness of having to say “OK, Google” to yourself in public. Thankfully, Assistant is very well-suited to sitting in your ear all the time — I asked it a couple of random questions and got correct answers in mere moments, just like when I talk to my phone instead. Of course, Google’s built-in translation is a show-stealer, but we couldn’t really test that out. Something to save for the review units, I guess. Even so, it’s a very powerful differentiator for the Pixel Buds — if the AirPods’ claim to fame is convenience, then the Pixel Buds’ is sheer usefulness.

Sadly, being in a corner of a room packed with hungry journalists made it difficult to get a sense of how they really sound. The music I piped through was perfectly audible and even pretty pleasant — even though the Buds don’t seal your ear canal the way in-ear buds do — but the nuances were pretty quickly drowned out. The bass came through nice and punchy, though. We’ll circle back on this when we get to test the Buds more thoroughly.

This isn’t the first time we’ve been able to wear Google Assistant — that came with smartwatches running Android Wear 2.0. It is, however, the first time wearing Google Assistant actually feels natural. I can count on two hands the number of times I’ve used Siri via AirPods, but I could actually imagine chatting with my Pixel Buds and getting value out of it. Here’s hoping the final versions are actually worth listening to — we’ll find out when they go on sale this November.


5 Oct

Google’s Pixel Buds translation will change the world


Google’s Pixel 2 event in San Francisco on Wednesday had a lot of stuff to show off, and most of it was more of the same: the next iteration of the flagship smartphone, new Home speakers and various ways of entwining them more deeply into your smart home, a new laptop that’s basically a Yoga running Chrome OS, and a body camera that I’m sure we’ve seen somewhere before. Yawn. We saw stuff like this last time and are sure to see more of it again at next year’s event.

But tucked into the tail end of the presentation, Google quietly revealed that it had changed the world with a pair of wireless headphones. Not to be outdone by Apple’s AirPods and their Tic Tac-style charging case, Google packed its headphones with the power to translate between 40 languages in real time. The company has finally done what science fiction and countless Kickstarters have been promising us, but failing to deliver on, for years. This technology could fundamentally change how we communicate across the global community.

The Google Pixel Buds are wireless headphones designed for use with the company’s new Pixel 2 handset. Once you’ve paired the buds to the handset, you can simply tap the right earpiece and issue a command to Google Assistant on the Pixel 2. You can have it play music, give you directions, place a phone call and whatnot — you know, all the standards.

But if you tell it to “Help me speak Japanese” and then start speaking in English, the phone’s speakers will output your translated words as you speak them. The other party’s reply (presumably in Japanese because otherwise what exactly are you playing at?) will then play into your ear through the Pixel Buds. As Google’s onstage demonstration illustrated, there appeared to be virtually zero lag time during the translation, though we’ll have to see how well that performance holds up in the real world with wonky WiFi connections, background noise and crosstalk.
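Under the hood, conversation-mode translation like this is generally a three-stage pipeline: speech recognition, machine translation, then speech synthesis. Here’s a minimal sketch of that flow — the function names are hypothetical stand-ins, not Google’s actual APIs:

```python
# Sketch of a conversation-mode translation pipeline as described above:
# speech-to-text, text translation, then text-to-speech. All function
# names are hypothetical stand-ins, not Google's actual APIs.

def recognize(audio: bytes, lang: str) -> str:
    """Speech-to-text in the speaker's language (stand-in)."""
    ...

def translate(text: str, src: str, dst: str) -> str:
    """Machine translation between the two languages (stand-in)."""
    ...

def synthesize(text: str, lang: str) -> bytes:
    """Text-to-speech audio in the listener's language (stand-in)."""
    ...

def conversation_turn(audio: bytes, src: str, dst: str) -> bytes:
    """One turn: what you say in `src` comes out as audio in `dst`."""
    text = recognize(audio, src)
    return synthesize(translate(text, src, dst), dst)

# The English speaker's words play from the phone's speaker in Japanese;
# the reply flows back through the same pipeline in the other direction.
reply_audio = conversation_turn(b"...", src="en", dst="ja")
```

The perceived “zero lag” then comes down to how fast each stage runs and how early the pipeline can start emitting audio, which is why flaky connections and background noise are the real-world tests to watch.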

This is a momentous breakthrough, to say the least. Just 20 years ago, if you wanted to have a passage of text translated using the internet rather than tracking down someone who actually spoke the language, you likely did it through AltaVista’s Babel Fish. Launched in 1997, it supported a dozen languages but often returned translations that were barely more intelligible than the text you put in. Over the next couple of decades, translation technology steadily improved but could never compete with native speakers for accuracy or speed.

In the last couple of years, we’ve seen some of the biggest names in technology jump into the translation space. In 2015, Skype debuted its Live Translation feature, which works with four languages for spoken audio and 50 languages over IM. However, the translations weren’t truly real-time: there was a lag between when the original message was sent and when the translated version arrived.

Earlier this year, Microsoft debuted its PowerPoint “Presentation Translator” add-in. Using an iOS or Android app, Presentation Translator can convert your narration into Spanish or Chinese in real time. It will not, however, make your PowerPoint presentation any less of an ordeal to sit through, so keep those slides to a minimum.

Both of those programs are impressive in their own rights; however, they’re a far cry from the hardware that Google has developed. Cramming all of the bits and pieces necessary to facilitate real-time language translation into a device small enough to fit in your ear — especially without the need for external computing power — is no easy feat. That’s not to say that people haven’t tried (looking at you, Bragi Dash Pros).

The Pilot – Image: Waverly Labs

Take last year’s Indiegogo project darling, the Pilot from Waverly Labs. Reportedly leveraging “speech recognition, machine translation and the advances of wearable technology,” these paired devices would be split between the people conversing, one inserted into each ear. When one person speaks, the other’s earpiece automatically translates those words. Or at least that’s how it’s supposed to work. The crowdfunding campaign closed last year, and deliveries have yet to begin, though the company states that it will begin shipping units in Fall 2017.

But there’s no need to wait for the Pilot now. Google didn’t just beat Waverly Labs to the punch; it knocked them down with 25 additional languages (40 to the Pilot’s 15), then stole their lunch money with a $159 price tag — $140 less than what Waverly wants for the Pilot.

But this isn’t just about an industry titan curb-stomping its startup competition; this technological advancement can, and likely will, have far-reaching implications for the global community. It’s as close as we can get to a Douglas Adams-esque Babel fish without having to genetically engineer one ourselves. With these devices in circulation, the barriers of communication simply fall away.

You’ll be able to walk up to nearly anybody in another country and hold a fluid, natural-language conversation without the need for pantomime and large hand gestures, or worry of offending with a mispronunciation. International commerce and communication could become as mundane as making a local phone call. The frictions of international diplomacy could be smoothed as well, ensuring not only that a diplomat’s words are faithfully translated but that a copy of the conversation is recorded as well.

Granted, this isn’t some magic bullet that will single-handedly bring about world peace and harmony among all peoples. You’ll still have plenty of nonverbal and culturally insensitive means of putting your foot in your mouth, but until we make like the Empire and develop Galactic Standard, Google’s Pixel Buds are our new best bet for understanding one another.


5 Oct

The Pixel 2 proves headphone jacks are truly doomed


As usual, Apple started a trend. Last year, it dropped the standard 3.5mm headphone jack from the iPhone. The rest of the industry was quick to respond — Motorola had actually beaten Apple to the punch, removing the port from the Moto Z before the iPhone 7 was even announced (though curiously, it remained on the cheaper Z Play). HTC followed suit with the U Ultra this year, as did the geek-friendly Essential phone. Now that Google’s Pixel 2 is confirmed to be headphone jack-less, it seems as if the port’s survival, at least in the mobile world, is a lost cause.

The truly sad thing? A year after this trend began, we still don’t have a good explanation of why we’re better off without headphone jacks. Removing the port opens up a bit of precious internal space, which allowed Apple to stuff in a bigger 3D Touch module in the iPhone 7 and 7 Plus. But did that actually help make 3D Touch more useful? And what have other phone makers gained, exactly, by jumping on this bandwagon? The additional room isn’t enough to significantly improve battery life, and aside from the Moto Z, it hasn’t led to an influx of ultra-thin designs either.

With the Pixel 2 and its larger companion, in particular, we’ve gained very little by losing the headphone jack. Sure, they’re much more water and dust resistant than the last models. But the Pixel 2’s IP67 certification is something several Android phones have offered for years — and they didn’t need to lose the port to achieve it. Typically when we move away from legacy hardware, we’re headed to something better. But in the case of the 3.5mm headphone port, the tech world seems to have forgotten that. Apple’s joking explanation — “courage” — isn’t enough.


I’m not blind to the benefits of wireless. My trusty BeatsX earbuds are the first pair I’ve used that sounds almost as good as great corded headphones. And I truly appreciate being able to use them on the subway without getting tangled up in cables. But here’s the thing: You don’t need to remove the 3.5mm port to enjoy the benefits of Bluetooth headphones. In fact, I’m running my BeatsX on an iPhone 6S — the last iPhone to include the 3.5mm jack. I just like having the flexibility to freely connect my phone to auxiliary cables in cars and corded headphones without carrying around any dongles. It’s 2017; that doesn’t seem like too much to ask.

And not to be too cynical, but it’s hard not to view the move away from headphone jacks as a way for companies to push their own expensive wireless headphones. It’s no coincidence that Apple’s $159 AirPods debuted alongside the iPhone 7 and 7 Plus (as did the BeatsX). Today, Google also showed off its own offering, the aptly named Pixel Buds. It’s almost as if tech companies realized consumers would shell out a bit extra for wireless headphones, rather than live the dongle life.

Image: Chris Velazco/Engadget

As someone who’s chosen this hill to die on, I can tell you the future looks bleak. Some manufacturers, like Samsung and LG, stuck with the 3.5mm port on their latest devices. Indeed, the LG V30 appears to be the ideal new phone for audio fanatics, thanks to its powerful HiFi DAC. A headphone jack could just end up being a niche feature that some manufacturers use to entice geeks. But that doesn’t help iPhone users who want to upgrade this year, or Android fans who want the purest experience possible with Google’s Pixel phones.

It was easy for me to skip the iPhone 7 last year, as it was only a minor improvement over the 6S. But with the new design of the iPhone X, as well as its improved cameras, it’ll be hard for me to stay away. And even if I were to make the leap to Android, I’m just as tempted by the Pixel 2 as I am by the Galaxy S8. As much as I’d like to stick with the headphone jack, it’s only a matter of time until I’m tempted away. I just wish we had a good reason for moving away from the most widely supported port ever. No dongle will stop me from being resentful over that.


5 Oct

Porsche Mission E caught testing against Teslas


By Joel Stocksdale

It’s been about two years since Porsche revealed its slinky Mission E concept, which promised Tesla-matching range and performance with Porsche’s driving dynamic expertise. Now we finally get a look at one on the road. It looks like the Mission E is far along in development, and it seems that Porsche is very serious about taking on Tesla, since the car was being driven alongside both a Model S and a Model X.

Focusing on the Porsche itself, though, the design does appear to have been toned down significantly. The ultra-low hood and extra-tall fenders of the concept have been raised and lowered respectively for a much less dramatic nose. The bulging fenders have also been constrained a bit.

But the overall look still draws from the concept. The car is still very low in profile, with a low belt line. The rear fenders still look wide in trademark Porsche fashion, and they’re highlighted by the wide corporate taillight band on the back. Other details from the concept include the side vents on the fenders, the big diffuser at the tail, and the vents in the front bumper that descend from the headlights. Those headlights also look to be similar slim units to the concept’s. One other fun detail is the faux exhaust tips on the back, meant to throw off spy photographers and passers-by.

Porsche has previously said that the Mission E would reach production by 2020, and according to our friends at Engadget, it should go on sale in 2019. Based on how complete the cars in these photos appear to be, we think the company has a good chance of hitting that target. When the concept was shown, Porsche promised 590 horsepower and, on the European test cycle, a range of over 310 miles. Also interesting was the concept’s claimed 800-volt electrical system that could be charged to 80 percent capacity in 15 minutes. Time will tell whether that system comes to fruition, but Porsche has at least tested some portion of the system on its Le Mans-winning 919 Hybrid race cars. Porsche also expects to sell the car for $80,000 to $90,000. All these features taken together would definitely make for a compelling Tesla alternative.
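Some rough, back-of-the-envelope math shows why that 800-volt system matters. Assuming a pack in the neighborhood of 90 kWh (Porsche hasn’t confirmed capacity), charging 80 percent in 15 minutes implies enormous power — and the higher voltage keeps the required current, and thus cable thickness, manageable:

```python
# Rough arithmetic on the Mission E's charging claim. The 90 kWh pack
# size is an assumption for illustration; Porsche hasn't confirmed it.
pack_kwh = 90.0            # assumed battery capacity
charge_fraction = 0.80     # charged to 80 percent...
minutes = 15.0             # ...in 15 minutes

energy_kwh = pack_kwh * charge_fraction            # 72 kWh delivered
avg_power_kw = energy_kwh / (minutes / 60.0)       # ~288 kW average
avg_current_a = avg_power_kw * 1000 / 800          # ~360 A at 800 V
# At a conventional 400 V, the same power would need ~720 A — hence 800 V.

print(f"~{avg_power_kw:.0f} kW average, ~{avg_current_a:.0f} A at 800 V")
```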

5 Oct

Renderings Imagine What an ‘iPhone X Plus’ Might Look Like


As Apple prepares to launch its first OLED iPhone with an edge-to-edge display, facial recognition, upgraded cameras, and other features, iDrop News has created renderings imagining what the future of the iPhone X might look like.

The renderings pair the existing 5.8-inch iPhone X with a larger model that has a 6.4-inch screen, based on the hypothesis that Apple is planning to release an all-OLED iPhone lineup with devices similar in design to the iPhone X.

Design-wise, the “iPhone X Plus” model in the rendering is identical to the iPhone X, with just a larger display to distinguish the two devices. It has the same notch-shaped top element to house the TrueDepth camera and sensors. Should Apple introduce a larger version of the iPhone X in 2018 or beyond, it’s not clear what it would be named, but “iPhone X Plus” likely isn’t an option.


Though the iPhone X won’t be available for purchase for another three weeks, we’ve already been hearing rumors about Apple’s plans for 2018 and beyond.


Early information suggests Apple is aiming to introduce at least two OLED iPhones in 2018, with displays that measure in at 5.85 inches and 6.46 inches, similar to the renderings above. Apple is said to be working with Samsung Display and other suppliers to source OLED displays for the two devices.

Separate rumors suggest that Apple is aiming for an all-OLED lineup for 2018 or 2019, with specific timing dependent on whether Apple can secure enough OLED production capacity from its various partners.

Apple was not able to introduce an all-OLED lineup in 2017, instead pairing the $999 OLED iPhone X with the standard LCD iPhone 8 and 8 Plus, both of which have lower price tags.



5 Oct

Apple Stops Signing iOS 10.3.3 and iOS 11.0, Downgrading No Longer Possible


Following the release of iOS 11.0.1 and iOS 11.0.2 on September 26 and October 3, respectively, Apple has stopped signing both iOS 10.3.3 and iOS 11.0, the previous versions of iOS that were available to consumers.

iPhone, iPad, and iPod touch owners who have upgraded to iOS 11.0.1 or iOS 11.0.2 will no longer be able to downgrade to iOS 10.3.3 or iOS 11.0.

Apple routinely stops signing older versions of software updates after new releases come out in order to encourage customers to keep their operating systems up to date.

iOS 11.0.1 and iOS 11.0.2 are now the only versions of iOS 11 that can be installed on iOS devices by the general public, but developers can download iOS 11.1, a future update that is being beta tested and will be released in the near future.


5 Oct

The Ohio State University Working With Apple on Digital Learning Initiative


The Ohio State University today announced that it has worked with Apple to create a comprehensive, university-wide digital learning experience that includes an iOS design laboratory and opportunities for students to learn coding skills.

Called the Digital Flagship University, the initiative will include an effort to integrate learning technology into the entire university experience. Along with the aforementioned iOS design lab, which will be available to faculty, staff, students, and members of the broader community, the university will aim to help students “enhance their career-readiness in the app economy.”

Apple CEO Tim Cook commented on the partnership, and said it will give students access to Apple’s new coding curriculum.

“At Apple, we believe technology has the power to transform the classroom and empower students to learn in new and exciting ways.

“This unique program will give students access to the incredible learning tools on iPad, as well as Apple’s new coding curriculum that teaches critical skills for jobs in some of the country’s fastest-growing sectors,” said Cook. “I’m thrilled the broader central Ohio community will also have access to coding opportunities through Ohio State’s new iOS Design Lab.”

Ohio State University’s Digital Flagship University will launch in the 2017-2018 academic year, with the design lab set to open in a temporary space in 2018 before moving to a more permanent location in 2019.

Starting in 2018, first-year students at the Columbus and regional campuses will be given an iPad Pro with Apple Pencil and Smart Keyboard, as well as apps, all funded through the university’s administrative efficiency program. Swift coding sessions will begin during the spring semester of 2018.

The iOS design lab will provide technological training and certification to students and community members who are interested in developing apps in Swift.

The Ohio State University also plans to integrate Apple technology into other areas of the university, introducing a chemistry course where students can complete assignments online with iTunes, debuting iPads for journalism and biology students, and more.



5 Oct

Snag a refurbished HP laptop EliteBook Folio 9470M for just $210 on Newegg


Get the perfect balance of portability and power with this refurbished HP EliteBook Folio 9470M, which is currently over 90 percent off, but only for a limited time. The laptop is HP’s first enterprise Ultrabook with docking capability, making it great for business users who want to stay productive anywhere they go.

The enterprise laptop packs a third-generation Intel Core processor, giving you ultimate control, security, and remote manageability. At just 0.75 inches thick, it’s the thinnest EliteBook to date, yet it still offers a 14-inch (355.6mm) diagonal display. It has VGA, DisplayPort, Ethernet, and three USB 3.0 ports, so you can be more efficient wherever you take it.

We got our hands on an HP EliteBook Folio 9470M and found its slimness and generous selection of ports to be huge pluses, along with the user-removable battery that doesn’t jut out and the pointstick and touchpad with discrete mouse buttons. The laptop also runs fairly quiet: it’s nearly silent when not connected to the AC adapter, and even when it’s plugged in with the fan running constantly, noise levels aren’t noticeable even in quiet environments. We concluded that if ports and a removable battery are high on your list, you should definitely consider this HP model.

In testing, we found that under normal usage you can get about seven and a half hours or more of battery life, and if that’s not enough, you can carry a spare battery and easily swap it in yourself. The laptop has a generous touchpad and two sets of mouse buttons. Because you get distinct left and right mouse buttons, the design helps prevent finger misplacement and accidental multitouch-gesture activation. And if you’d rather use the pointstick for navigation, you’ll find it precise and easy to use.

You can keep the computer and your documents fully protected with a full suite of EliteBook business software and antivirus protection, which comes preloaded and includes HP Client Security and HP BIOS Vault. This model ships with the higher-end Windows 10 Pro 64-bit operating system and 90-day limited parts and labor warranties.

The HP EliteBook Folio 9470M normally retails for $2,680 but for the next few days, you can score this refurbished model for $210 on Newegg, saving you $2,470 (92 percent).






5 Oct

Researchers are measuring ocean health with drones, A.I., and whale snot


Why it matters to you: Gathering health data about ocean and Arctic creatures should help scientists better understand the impact of climate change.

How do you determine the health of the oceans and the Arctic? With some drones, artificial intelligence, and a bit of whale snot. On World Animal Day, Wednesday, October 4, Intel shared how its technology is being used as part of two successful wildlife research projects involving camera drones and artificial intelligence. The company recently partnered with a wildlife photographer, a conservationist, and two non-profit organizations to study the health of polar bears and whales, both animals that offer clues to the health of their ecosystems and the impact of climate change.

In the first study, wildlife photographer Ole Jorgen Liodden used Intel’s Falcon 8 drone system and a thermal camera to study the habits of a group of polar bears. With the drone’s aerial views, Liodden was able to monitor behavior like feeding, breeding, and migration. The data, Intel says, will help scientists understand how the animals are impacted by climate change, which offers a glimpse of the health of the arctic ecosystem as a whole.

Traditionally, researchers would have studied the bears’ movements using helicopters, a method that’s both expensive and invasive to the bears, or boats, a scenario that’s dangerous for researchers because of the harsh conditions. The polar bears, by contrast, did not appear to be affected by the drone’s noise or appearance, even when the UAV was flown between 50 and 100 meters away.

“Polar bears are a symbol of the Arctic. They are strong, intelligent animals,” Liodden said. “If they become extinct, there will be challenges with our entire ecosystem. Drone technology can hopefully help us get ahead of these challenges to better understand our world and preserve the earth’s environment.”

The second research project is offering a better understanding of the health of the oceans by looking at, yes, whale snot. Aptly dubbed Project SnotBot, the effort pairs Intel with Parley for the Oceans and Ocean Alliance to use artificial intelligence to monitor a whale’s health in real time. SnotBots first started studying the whales earlier this year.

After the spout water is collected from several different species of whales — including blue whales, right whales, gray whales, humpback whales, and orcas — Intel’s AI algorithms analyze the sample for several different elements. The whale snot contains a wealth of information, including stress and pregnancy hormones, viruses, bacteria, toxins, and DNA. The machine learning technology, Intel says, allows researchers to access the data in real time in order to make more timely decisions.
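What “real time” plausibly means here is automated triage: turning assay readings into flags a researcher can act on immediately. Here’s a minimal sketch of that idea, with invented field names and thresholds (not Intel’s actual pipeline):

```python
# Illustrative sketch of real-time sample triage for Project SnotBot.
# Field names and thresholds are invented for this example, not Intel's.
def triage_sample(sample: dict) -> list:
    """Flag notable findings in a whale blow sample."""
    flags = []
    if sample.get("cortisol_ng_ml", 0) > 10.0:       # invented threshold
        flags.append("elevated stress hormones")
    if sample.get("progesterone_ng_ml", 0) > 5.0:    # invented threshold
        flags.append("possible pregnancy")
    if sample.get("pathogen_dna_detected"):
        flags.append("virus/bacteria DNA present")
    return flags

print(triage_sample({"cortisol_ng_ml": 12.3, "pathogen_dna_detected": True}))
```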