22
Jun

How to Stop WhatsApp Auto-Saving Images and Video to Your iPhone’s Camera Roll


Some 60 billion messages are sent over the WhatsApp chat platform every day. One of the reasons for the service’s massive popularity is that it lets users send and receive as many media-rich messages as they want, which – as long as they’ve limited WhatsApp’s use of their cellular data plan – costs them next to nothing.

That’s great news for senders, but one of the drawbacks of receiving multiple images and video clips from your WhatsApp contacts is that they’re automatically saved to your iPhone’s Photo Library. Apart from being an unwelcome sight in your personal Camera Roll, they can start to take up valuable storage space. Fortunately, you can easily prevent this default behavior by following the steps below.

How to Stop WhatsApp Saving to Your Camera Roll

Launch the WhatsApp app on your iPhone.

Tap the Settings icon (the small cog wheel) at the bottom right of the screen.

Tap Chat Settings.

Toggle off the Save Incoming Media option so that it no longer shows as green.

Once you’ve turned off the above setting in WhatsApp, you can still manually opt to save individual media files that you receive in a chat thread. If your iPhone supports 3D Touch, simply hard press on the photo or video clip in question and swipe up to reveal the Save option. Alternatively, you can access the Save option by tapping the photo or clip and selecting the Share icon in the lower left of the screen.

How to Limit WhatsApp Media Downloads to Wi-Fi

If receiving images or video over WhatsApp is sapping your cellular data when you’re out and about, you can prevent them from downloading to your iPhone until you’re safely back in range of a Wi-Fi connection.


To do so, return to WhatsApp’s Settings tab and select Data and Storage Usage. The options under Media Auto-Download let you dictate which types of media can be downloaded and under what circumstances. Make sure that the media types you’re happy to wait for are set to Wi-Fi.

Tag: WhatsApp


Microsoft Updates Bing iOS App With AI-Powered Visual Search


Microsoft has announced a new visual search feature for its Bing app that lets users snap a picture with their phone’s camera and use it to search the web.

The new visual search function builds on the AI-powered intelligent search capabilities already used by Bing, and works pretty much like Google Lens: Users take a photo of something or upload one from their camera roll, and then the search engine identifies the object in question and offers additional information by providing links to explore.

The feature appears as an icon in the Bing app’s search bar, and can be used to search for everything from landmarks to breeds of dog, but Microsoft is pushing it as a way to shop from photos for clothes and home furniture:

Let’s say you see a friend’s jacket you like, but don’t know its brand or where to purchase. Upload a pic into the app’s search box and Bing will return visually-similar jackets, prices, and details for where to purchase.


Visual Search is available today in the U.S. via the Bing app [Direct Link]. Microsoft says the feature will also roll out soon to Microsoft Edge on iOS, as well as to Bing.com, which remains a search engine option in the latest version of Apple's Safari browser.

Tags: Microsoft, Bing


Laptop vs Desktop: Which One Should You Get?



Source: https://www.pexels.com/photo/business-computer-connection-contemporary-450035/

When you've already made the decision to buy a computer, you're met with another important choice: laptop or desktop? It can be a little confusing to decide which one is the better fit. It all comes down to choosing the hardware that best meets your needs.

There are some key considerations to weigh when you're shopping for a laptop or a desktop. Whichever you pick, each has its own set of advantages and disadvantages. Check out a brief summary of the two below:

Laptops

Since their invention, laptops have been touted as portable computers you can carry everywhere you go. A laptop has its own rechargeable battery that can power the device for a few hours, which is useful if you are constantly on the move and need to work on something while in transit. To stay portable, laptops are smaller than most desktop computers, take up less space, and weigh less.

Laptop computers have come such a long way in the last decade that some have internal components, like processors and GPUs, that can rival today's midrange desktop computers. The downside is that decent gaming laptops tend to be on the expensive side. Their power is also limited by their compact build: high-performance components need a lot of space, and a small chassis is a disadvantage when it comes to heat dissipation. Heavy-duty gaming is possible, but without an external cooling system the machine runs the risk of overheating and permanent damage.

Furthermore, upgrades can be a bit of a drag with these devices, since most components are built in and not removable; usually only the memory and hard drive can be swapped out. That is another disadvantage when trying to replace malfunctioning or broken parts, because spare parts can be hard to find. If you find yourself charting this path in the near future, check first whether it is cheaper to replace your laptop instead of upgrading it, because the latter may end up more costly. Screen size depends on the size of the laptop, but you can connect an external display if needed.

When it comes to ease of use, laptops have this on lock. They require little to no setup out of the box: charge the device for the recommended amount of time, turn it on, and begin using it. Accessorizing is just as straightforward, because you only need a laptop sleeve or a hard case to protect the device from bumps and scratches.

Desktops

Desktop computers dwarf laptops in size; unlike the latter, they are bulky and come in many parts. They are designed to stay in one place, like the home or office, and because they don't get much mileage, they are less prone to being damaged in transit. A desktop needs to be plugged in at all times, so while it may not be portable, it is reliable and ergonomic, with a full-size keyboard and a proper screen.

They come in a large range of prices because you can buy them pre-built or customize them to suit your needs. Since size is not a problem, desktop computers can accommodate many more components, including the most advanced processors, GPUs, and motherboards. Overheating is largely preventable in desktop PCs because the case can accommodate air-cooling fans or even a liquid-cooling system.

The beauty of building a PC of your own is that you can easily control your budget and decide which components to spend more on, because there are many budget-friendly alternatives on the market. Another advantage is the option to customize your desktop with specific specs for work or gaming, something that is not available with laptops.

Upgrades and replacements are a breeze, since you can take parts out and swap them easily. A downside is connecting everything together; while it may not require rocket science, it is still extra work needed to make the machine operational.

The Bottom Line

It all comes down to what you need and prefer. Whatever you choose, it should answer your needs and suit your lifestyle. With so many laptops and desktop computers on the market today, both offer a range of affordable options, and you should have no problem finding one that fits.

Are you a laptop or desktop person? Share your perspective in the comment section below.


Processed pies: Silicon Valley’s Zume Pizza ready to offer you dinner made by a robot


Even the food in Silicon Valley is getting high tech. No, we’re not talking about meal-replacement shakes or coffee with butter in it — we’re talking good, old-fashioned pizza. Well, not exactly old-fashioned — the pies from Zume Pizza are made by robots. To help it reach new heights, the company is upgrading its existing robots by adding arms into the mix.

It is the latest in a new trend within the food industry  — and a number of industries globally —  to increasingly depend upon machines rather than human labor. In this case, robots are not just taking your order or clearing your tables — they are the chef, too.

In order to help these machine chefs better do their jobs, Zume is giving them ABB robotic arms, which are capable of pulling pizzas from 800-degree ovens, and placing them on shelves. So efficient are these appendages that they can fill a whole rack in just 4.5 minutes.

“We’re going to eliminate boring, repetitive, dangerous jobs, and we’re going to free up people to do things that are higher value,” co-founder Alex Garden, a former Microsoft manager and president of mobile game maker Zynga Studios, told the Associated Press in 2016. Given that a stat from Cintas claims restaurant jobs are responsible for up to 33 percent of occupational burns, Zume really could be saving humans several trips to the hospital.

Already, Zume’s robots are taking care of spreading sauce on pizza dough when it comes down a conveyor belt and sliding the raw masterpieces into a hot oven. While humans are still dealing with prepping the dough and ensuring the right amount of cheese and toppings make it onto every pie, robots will soon be in charge of that aspect as well.

“We automate those repetitive tasks so that we can spend more money on higher quality ingredients,” said Julia Collins, Zume’s CEO and co-founder. “There will always be a model here at Zume where robots and humans work together to create delicious food.” So don’t worry — robots won’t be kicking you out of the restaurant industry altogether, yet.

That said, robots seem to be taking the place of humans at an increasingly alarming rate. Wal-Mart cut 7,000 jobs due to automation, and both a hardware manufacturer and fast-food chain Wendy’s have made similar changes to their workforces. The restaurant industry has not always had luck replacing people with machines; in fact, a Chinese restaurant chain had to shut down a couple of locations due to poor service from its robotic waitstaff.

But who knows — maybe machines will have better luck with Italian cuisine.

Updated on June 21: Zume adds a robotic arm to help with its automated pizza-making process. 





YouTube boosts creators with channel subs, merchandise stores, and premieres


YouTube is offering its creators more ways to rake in the cash.

The video-streaming site announced several new money-making possibilities on Thursday at the annual VidCon gathering in Anaheim, California.

First up is Channel Memberships, where creators can offer bonus items to their followers for a $5 monthly fee.

The perks could include anything from “exclusive livestreams, extra videos, or shout-outs” to “unique badges, new emoji, members-only posts in the Community tab,” as well as unique custom perks offered by creators, YouTube said in a blog post announcing the new features.

Until now, Channel Memberships have been available as Sponsorships for YouTube Gaming channels, but the feature is now coming to all eligible YouTube channels with more than 100,000 subscribers.

The company is also expanding opportunities for creators to sell merchandise through their channels by teaming up with Teespring, which offers customization options on more than 20 merchandise items, including logo-emblazoned shirts and, if you really must, phone cases “with a creator’s face.”

The merchandise feature is available from today to all eligible U.S.-based channels with more than 10,000 subscribers, with YouTube aiming to introduce additional merchandising partners in the near future.

And there’s more. YouTube is launching Premieres, which enables creators to debut pre-recorded videos as a live moment.

“When creators choose to release a Premiere, we’ll automatically create a public landing page to build anticipation and hype up new content,” the blog post explains. When fans hit the page to watch the new video, they’ll be able to chat with each other, as well as with the creator, in a live chat before the video begins. “It’s as if a creator’s entire community is in one theater together watching their latest upload,” YouTube says.

YouTube’s latest initiatives for revenue generation follow recent controversy on the site that’s seen some creators complaining about what they see as unfair demonetization of their videos. It’s also a response to efforts by Facebook to attract YouTube creators to its platform, with more pressure coming in recent days with the launch of IGTV, a new long-form video feature from Facebook-owned Instagram.

YouTube is in the middle of a busy patch just now. Earlier this week the Google-owned company launched a revamped version of YouTube Music and introduced YouTube Premium subscription services.

YouTube streamed its very first video in 2005 and now has more than 1.9 billion monthly users across 90 countries and 80 languages.





Google Measure app now works on any ARCore-capable phone


Apple has been embracing augmented reality on the iPhone, and recently introduced a new feature to iOS 12 that allows users to measure real-world objects using AR on their phone. Now, Google is expanding its version of the feature. The Google Measure app is being expanded to work on any ARCore-capable phone. Previously, it was only available on Project Tango devices that had specialized sensors.

In other words, dozens of Android phones can now take advantage of augmented reality measuring capabilities, including the likes of the Google Pixel 2, Samsung Galaxy S9, and more. Even an original Google Pixel XL will support the new app.

When you first open the app, which is Measure 2.0, it walks you through how to use it. We found the app generally easy to use when it worked properly. After installation it seemed a little buggy at times, though moving the phone around the room fixed most of those issues, as it allowed the app to better scope out the environment.

So what about accuracy? When used properly, the app did pretty well. It correctly measured a desk that was six feet wide, but it measured an 11-inch ledge on the desk as 10 inches. The app itself warns that its measurements should be accurate to plus or minus one inch, so it does deliver on its promise — though if you’re looking for something ultra-precise, you may want to stick to that trusty tape measure.

The app also lets you take a photo, so you can save the measurements you take without having to continuously remeasure. That’s a handy feature, especially for those who might need to take their measurements to a furniture store, for example. In the settings, you can also switch between metric and imperial units, so it should be usable in countries around the world.

Of course, the app will likely continue to get better over time as Google applies machine learning and its object-recognition technology continues to get better. Either way, if you want to get the Google Measure app for yourself, visit the website.





The Glitch Mob spent a year making a VR experience as otherworldly as their music


Daniel Johnson

Watching electronic dance music (EDM) group The Glitch Mob perform would seem to be the perfect greeting after being abducted by aliens.

The group’s current stage design is a space-inspired work of art known as The Blade 2.0, and the trio of musicians perform in glowing pods with thick Dell touchscreens at their fingertips, like Star Trek pilots. But the elaborate design isn’t only about aesthetics. The touchscreens are there for the group to precisely re-create their complex music live, and the pods are partly so the crowd isn’t staring at the back of laptops for an hour.

“The technology is just there for us to be able to perform our music.”

“The important thing, really, is that the technology gets out of the way,” Glitch Mob member Justin Boreta told Digital Trends. “The technology is just there for us to be able to perform our music.”

Boreta spoke with Digital Trends after Glitch Mob brought The Blade 2.0 to this year’s Governors Ball Music Festival in New York City. He discussed how the band’s latest album entered virtual reality, the group’s influences as “children of music technology,” and why they think Windows-based hardware is better than Apple’s.

Digital Trends: Explain to me exactly what this spaceship-looking stage design called The Blade 2.0 is and who came up with the idea? How is it an improvement on 1.0?

Justin Boreta: I’ll give you a brief bit of context here. We all used to be DJs. The way that we came together as a group was that we were individual DJs, and we decided to play at the same time. So we started by connecting our computers and having a sort of DJ jam-out. Then at some point, we decided to make original music, and it continued from there. Collaboration has always been in the DNA of what we do.

At one point, we wanted to figure out a way to perform electronic music but not with instruments and keyboards, and stuff like that. We wanted to actually play music like a rock band, and have the live performance aspect on stage. So we started taking these touchscreen controllers at the time called Lemur, which are not even around anymore. This was right around the time iPads came out, or right before iPads. We would tilt them toward the crowd and remove the laptop from the equation. So when you’re up there DJing, and you’re looking at a laptop, it’s hard to really connect with people. So, we tilted these touchscreens toward the crowd and started triggering all of our sounds off Ableton, which is the software that we use to write and perform it.

A stage design isn’t just about function, it’s also about form. What were the compromises you made to make sure The Blade 2.0 not only performed well but carried a certain aesthetic?

An interesting thing about doing a show like this is that it has to get taken down and put up every single night. So there’s a special skill set our crew, team, and designers need to fit it into all these different stages right. We played Bonnaroo last week in front of 30,000 people, or something like that, on this massive stage. Then sometimes we’re traveling around the country playing very small stages where we have to shrink down. So the whole thing can really telescope out and become larger. There’s a lot of compromise that happens there, just spatially. I mean, if we had all the space in the world to play with, we would do a lot more stuff.


EDM, unlike most other forms of music, has its progression tied largely to the technology used. What have been some technologies that didn’t exist, or weren’t as popular in 2010, when you debuted with your ‘Drink The Sea’ album, that have come out in the years since?

Yeah, absolutely. I think that it’s part of our ethos, to use technology to get up there and perform. It’s an important part of what we do because we are children of music technology. Everything that we do is pushing the boundaries of the computer. We’re always using the fastest computers and always breaking them. [Laughs]. We have 20 controllers plugged into one laptop. Just what we can do right now, on stage with The Blade, wouldn’t have been possible even a couple of years ago. In the very beginning, it was totally different.

“What we can do right now, on stage with The Blade, wouldn’t have been possible even a couple of years ago.”

So, mainly, the software we use for everything is based off of Ableton, Ableton Live. We have a laptop running this massive session with all of our songs in there. We have a documentary about the craziness of it. But, there’s really a lot that’s going on behind the scenes to make everything feel reliable so that everything doesn’t crash. But really, what we do wouldn’t even have been possible a couple of years ago, or without all of the sort of music tech stuff. … The important thing really is that the technology gets out of the way and the touchscreen turns into something else. The technology is just there for us to be able to perform our music.

Your new album, ‘See Without Eyes,’ has such a wide variety of sounds meshed together. In some cases, I’ve never heard anything like them before. What are some of your favorites, and were there any weird or accidental ways you’ve discovered a new sound?

Absolutely. The way that we work, there are a lot of found sounds that go into the music. There are a lot of custom sounds, and there are field recordings from our lives that go in there. There’s also a lot of experimentation and happy accidents. So, some of the songs started off at one tempo, and they changed to another. Or we recorded vocals for one song, and then used them for another.

So for instance, the vocals on Take Me With You were actually recorded for a different song on the album. We have all this material that we treat like samples, almost as if they’re samples we’ve taken off of vinyl, but they really are vocals that we recorded. Then we take it, chop it up, mix it, recontextualize it, and put it in there. So, there’s a lot of that sort of mosaic work that goes into the record, and there’s tons of sound design that is really just there to add another layer of narrative and a cinematic texture to everything.

You’re also putting out a VR experience in support of your new album. What are the benefits of VR for musicians and how long did it take to create your experience? Who did you work with?

We collaborated with a company called The Wave, Dell, and Alienware to create this VR experience. VR’s really exciting, even though it’s something that is just all happening right now, and the technology is getting there. But there’s some really crazy stuff that’s possible. This is an entirely new way to experience the album. It’s a music video times a million.

This is an entirely new way to experience the album. It’s a music video times a million.

You get to get inside of the music and interact with it, and you’re on that trip through this four-song — I think it’s four songs right now — journey. It’s an entirely new way to experience music because you actually get to go into a literal narrative through it. I mean, I heard it in an entirely new way, and we have a lot of visual cues that we use to create the album art, to create all of the videos. We collaborated with The Wave and this guy named Strangeloop and his studio, who we’ve collaborated with a lot, because they understand the ethos and the DNA behind the project. They were able to create a whole world around it, easily.

How long did it take? How involved were you guys in the final design?

Took about a year. It’s actually interesting because we were very, very involved in the whole process. We didn’t do any of the actual programming ourselves, but we were in constant contact, texting, calling, and going to the studio all day to look at stuff. But if you go on YouTube right now, you can see that we have a video for every single song, a sort of visual accompaniment for the album. So, Strangeloop’s studio created that using game engine software used for VR. … He created the video mainly in Unity and Cinema 4D, then he went and made the VR version of it because he had created the world using gaming engine software. So, already it was easy to turn it into a VR experience.

This is your first album in four years. Why such a long gap in between albums? How has the group changed?

Well, it took us about a year to a year and a half to write the album, and we had an EP come out in 2015, mid-2015. So basically, we were just on tour. When we released the album, we did a two-year tour cycle. Pretty much all 2015, we were on tour. Then 2016, we started writing the album … . So, it’s really sort of like pausing the whole tour life and coming back to really do the work it takes to write a full album like this and everything around it.

With such complex music, it must be common for songs to go through numerous iterations before they’re released. For the new album, what was the most difficult song to complete in terms of how many different versions you had and the hardest to put together?

How Could This Be Wrong, I think, was the one that had so many versions. It was up into the hundreds. [Laughs]. We had actually gotten rid of that song and deleted it, and then at one point, we thought, “Wait a second. We want a song that has this particular vibe.” So, we brought it back, and we had met Tula, who is the vocalist on there. She also did the vocals on the Keep On Breathing track, which we loved.

So, we took the song out of its grave, sent it to her, and she sent the vocals over, and we actually revived it. So, that was the last addition, but it took a lot of wrangling to change everything, and take it from where it was to a completely different song.

See Without Eyes is your second consecutive album to hit No. 1 on the Dance/Electronic Album chart after your first few records failed to reach the Top 5. How much of your recent success do you attribute to new distribution technologies like streaming?

I mean, clearly, I think that a lot of that has to do with the fact that the algorithms that measure what becomes No. 1 change over time. To be quite honest, we don’t stay too focused on that. I mean it’s nice that we have a No. 1 song and a No. 1 album, and the fact that really our fans are out there listening to music over and over again. We’re an independent label, we don’t have a major label behind it, we did it with our own machinery here. We’re pretty DIY. It’s a great sort of hat tip to the music and the fact that we try to make music that’s classic and timeless, and that people will grow with.

The Blade 1.0 was used in promotion of your 2014 album Love Death Immortality, and 2.0 is for the most recent album. Will you continue this trend for your next album? And if so, what do you think it’ll look like?

It’s an interesting progression because, as technology gets stronger and there’s the ability to create more immersive shows, we’ll be constantly finding new ways to tell a story with music. So, who knows? But we have some more music in the works. We have some cool stuff that’s going to be coming out soon. I think for the fall tour, we might make some updates. It might be a Blade 2.1 or maybe a 2.5. I can’t say just yet, but there’s going to be some fun updates to it. We’re all trying to figure out new ways to rock out more on stage.

Go back in time for me a bit. What was the first gadget you fell in love with?

“We use the most powerful Mac Pro, completely maxed out next to the most maxed out Alienware and it’s quite a bit more than twice as fast.”

The first gadget I ever fell in love with? Wow, that’s a really good question. You know, it would have to be my very first computer, which was an Apple IIGS, very early Macintosh. My grandfather bought it for me when I was super-young. I was like 5 years old or something like that. [Laughs]. I spent a lot of time playing games and learning how to write computer code and I always had a computer from there on out. I’m not a classically trained musician, in any way. None of us really are. We all just came into it from tinkering.

Then after that, I had a PC with Fruity Loops in high school. I would spend all of my time learning how to make jungle and drum and bass. So for me, I come into music through technology. I’m completely a student of the tech first and then sort of a music theory [student] second.

What is the current tech obsession or fascination you have right now?

I think the biggest thing is the move we made over to Windows machines, which is funny because, in the music world, people typically write on Macintosh. But … we really need the most horsepower possible, and there’s a lot of graphics processing happening on the graphics cards we have in these Alienware machines. We used the most powerful Mac Pro, completely maxed out, next to the most maxed-out Alienware, and it’s quite a bit more than twice as fast.

So, for us, that just means we can play more music. I think that it’s going to be a really exciting time for Windows and Dell, specifically because what they are doing with allowing musicians to create stuff is going to allow some pretty next-level art to happen. It’s the same thing with VR. What we can do, all that stuff happened on these really powerful gaming machines. The gaming machine, the graphics cards, and the GPUs you need, putting that into the music tech world, we can do some really crazy stuff.





Bing Visual Search is a Google Lens competitor — with an extra feature


Microsoft wants searchers to be able to skip the keyboard and search not just with a photo, but within a specific part of that photo. Thanks to artificial intelligence, that feature is now arriving in the Bing app on iOS and Android. Visual Search, announced on Thursday, June 21, uses the camera or an existing photo to search or shop for objects, landmarks, and animals, or to scan a barcode. The Google Lens-like feature is rolling out to the Bing app as well as Microsoft Launcher on Android, and is also expected to head to Microsoft Edge and bing.com at a later date.

Visual Search uses a photo instead of a keyword to search the Bing platform, working with both existing photos and new photos snapped in-app. Using object recognition powered by A.I., the tool can identify a specific flower or dog breed, along with places and landmarks. Visual Search can also be used to shop, such as by taking a photo of a piece of apparel or furniture to find similar items.


The tool is accessible from the camera icon inside the Bing app. For photos with multiple objects, it also includes an option to draw a box around the object you want to search for, instead of getting results for everything the program can recognize. That is one feature that may set Bing Visual Search apart from similar tools like Google Lens.

Microsoft says Visual Search expands on the A.I. already inside Bing Image Search, including a feature launched late last year for uploading a photo to find similar items in fashion and home furnishings.

Besides a photo being more descriptive than typing keywords like “orange flower,” image-powered searches also help identify the item whose name slips your mind, or the species of flower you don’t recognize. “Sometimes, it is almost impossible to describe what you want to search for using words,” Vince Leung, product lead for Bing Images, said in a blog post.

Bing isn’t the first platform to add the option to search with a camera — Google and Pinterest have similar image search options. Google Lens, along with flowers and landmarks, can also recognize books and album covers, and can read text in real time for tasks like photographing a flyer to add an event to your calendar. The iOS version inside Google Photos works only on existing photos rather than through an in-app camera, however, and doesn’t offer a way to single out one object in a photo of multiple items.

Visual Search is rolling out now inside the Bing app on iOS and Android, as well as Microsoft Launcher (Android only). Microsoft says the feature will also come to bing.com and Microsoft Edge.

Editors’ Recommendations

  • Here’s what Google Lens’ Style Match, Smart Text Selection features look like
  • Nikon developing a 500mm super telephoto with compact Fresnel design
  • Google Lens can now identify the breed of that cute dog you just saw
  • Google Lens is now available as a stand-alone app on the Google Play Store
  • Sigma 14-24mm F2.8 Art review



22
Jun

How to download and install MacOS Mojave today


Apple’s official release date for MacOS Mojave might be just around the corner, but you can get it today as long as you’re a member of Apple’s Developer program. Here’s how to join, and how to download MacOS Mojave today.

It is important to note, before you get started, that MacOS Mojave only runs on Macs introduced in mid-2012 or later, plus 2010 and 2012 Mac Pro models with a Metal-capable graphics card. Head here to see if your Mac qualifies.

Backup first!

Before we go any further, make sure you back up your files. For anything important that absolutely cannot be replaced, send it off to the cloud — Dropbox, iCloud, and OneDrive are great for this — or create a hard copy on a flash drive or external hard drive. MacOS Mojave is still in beta, and even a major update that has been through numerous test phases can carry bugs that don’t show up until it hits widespread release. Backing up is a vital first step.
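Once you’ve copied your files to an external drive, it’s worth confirming the copy is actually complete. Here’s a minimal sketch of one way to check a backup folder against the original by comparing file hashes; the function names and approach are illustrative, not part of any Apple tooling:

```python
# Sketch: verify that a backup copy matches the original, file by file.
# Helper names here are illustrative, not part of any Apple tooling.
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(source: Path, backup: Path) -> list[str]:
    """Return relative paths that are missing from, or differ in, the backup."""
    problems = []
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        rel = src_file.relative_to(source)
        dst_file = backup / rel
        if not dst_file.is_file() or file_digest(src_file) != file_digest(dst_file):
            problems.append(str(rel))
    return sorted(problems)
```

Point it at your home folder and the copy on your backup drive; an empty list means every file made it across intact.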

Sign up for the Apple Developer program

Be warned, this will cost you. A subscription to the Apple Developer program will run you about $100 a year. That’s not cheap, so if you’re having second thoughts, it wouldn’t be a bad idea to just wait for the public release this fall. If you’re still willing to shell out to join the Developer program, head here to get started.

You’ll want to hit the Enroll button at the top right of the page, sign in with your Apple ID, and fill out any information requested. Once you’re enrolled, head over to developer.apple.com and click on the Downloads section, where you’ll see a list of all the available betas and tools. Click on the MacOS Mojave Developer Beta and the download will start.

Download and install!

Once the download finishes, you’re almost there. You just have to go find that file and fire it up. It’ll act as the ‘key’ to add the MacOS Mojave Developer Beta to your App Store. By default, the file will land in your Downloads folder, so navigate over there and open up the file. It should be named something like MacOS Developer Beta Access Utility.

Opening the file will present you with a package; open that up, agree to the terms and conditions, and you’re almost there. Next, the App Store will automatically open to the MacOS Mojave Developer Beta page, and the OS update should start downloading and installing on its own. Your system might reboot a couple of times once the install gets going, but afterward you’ll be all set.

Time to check out that sweet, sweet Dark Mode.

Editors’ Recommendations

  • MacOS Mojave brings Dark Mode, stacking, and a redesigned App Store to Macs
  • How to enable dark mode in MacOS Mojave
  • iPhone apps are finally coming to your MacBook. Eventually. Sorta.
  • Microsoft Office 365 apps will hit the revamped Mac App Store later this year
  • Macs leak sensitive data from encrypted files, even after they’re deleted



22
Jun

In the future, potholes could be repaired by asphalt-printing drones


Driving on roads covered in potholes is no fun. At best, it makes your ride bumpier and less enjoyable. At worst, it can cause serious damage to your vehicle and, potentially, to its occupants. Couldn’t cutting-edge technology help? Quite possibly yes, claim researchers from the U.K. They have proposed an unorthodox approach to pothole repairs: cameras equipped with image recognition technology constantly scan the streets for developing flaws, a drone is dispatched to the site, and an on-board 3D printer patches the hole with asphalt. Simple, right?
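The proposed workflow boils down to a detect, dispatch, repair loop. The sketch below illustrates that pipeline in Python; every name and threshold here is hypothetical, invented to show the shape of the idea rather than the researchers’ actual system:

```python
# Hypothetical sketch of the detect -> dispatch -> repair loop described
# above. Class names and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class Defect:
    x_m: float           # road position, meters (illustrative coordinates)
    y_m: float
    diameter_mm: float   # current size of the flaw

def needs_repair(defect: Defect, threshold_mm: float = 10.0) -> bool:
    """Intervene while the defect is still small, before weather and
    traffic grow it into a full pothole."""
    return defect.diameter_mm >= threshold_mm

def dispatch_queue(detected: list[Defect]) -> list[Defect]:
    """Return the defects a repair drone would be sent to patch."""
    return [d for d in detected if needs_repair(d)]
```

The interesting design point is the threshold: by acting on coin-sized flaws rather than waiting for ton-and-meter-scale damage, the repair stays a gram-and-millimeter-scale job.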

The concept is part of a larger, multi-university project looking at the possibility of self-repairing cities, and how robotics and other automated systems could be used to aid with repairs so as to cut down on disruptive road closures and other street works.

While it might sound like overkill to use drones, image recognition, and 3D printing for a simple repair job, Phil Purnell, professor of Materials and Structures at the University of Leeds, told Digital Trends that these systems could actually save money in the long run. “When you look at interventions in infrastructure — whether it’s roads, pipes, bridges, or similar — you’re very often using ton and meter-scale solutions for problems that started out as gram and millimeter-scale defects,” he said.

In the case of potholes, that means that what begins as a tiny, coin-sized dent in the road can quickly grow as a result of weather and repeated vehicular traffic. By using smart technology, the researchers think these defects can be nipped in the bud early, avoiding larger problems later. So far, researchers from University College London have built an asphalt extruder, which has been mounted on a University of Leeds hybrid aerial-ground vehicle for transport. It is capable of extruding asphalt with 1-millimeter accuracy.

The technology is certainly impressive, although Purnell noted that it’s still a long way from being deployed on real roads. What the work demonstrates is a proof of concept for how approaches like this may be used in the future.

“From a technical view, this is like Formula 1,” he said. “Twenty years ago the idea of [technology such as] energy recovery through braking systems was something that was seen as exotic when it was used on Formula 1 cars. Now it’s commonplace in many hybrid vehicles that you can drive about on the road today. It’s the same thing here. This is all about demonstrating how we can glue the various pieces of this puzzle together. We’re academics, so it’s our job to look at the high concept approach. Through our interactions with industry, they’ll then be able to find ways of implementing it.”

Editors’ Recommendations

  • 14 major milestones along the brief history of 3D printing
  • Researchers can now create 3D-printed structures made entirely of liquid
  • From vaping to drones, 8 tech trends we may look back on and cringe
  • Awesome Tech You Can’t Buy Yet: Smart Rubik’s Cubes, diving drones, robot artists
  • World’s first 3D-printed cornea made from algae and human stem cells