
Archive for 15 May

HP’s Envy curved AIOs sport a six-core CPU, Nvidia graphics, and maybe Alexa


HP is claiming a “world’s first” with the upcoming launch of its refreshed Envy curved all-in-one desktops, which support Amazon Alexa out of the box. The feature has nothing to do with Amazon and Microsoft’s partnership to combine Alexa and Cortana in a future Windows 10 update. Instead, it’s part of a rollout announced by Acer, Asus, HP, and Lenovo in January 2018 to release PCs with a special Amazon Alexa app developed for Windows 10 and launching “this spring.”

At this time, HP doesn’t have a launch date or set configurations for the 34-inch model, but says it will be released later this year. Meanwhile, the 27-inch model hits the streets this month with a starting price of $1,399. Given that the Amazon Alexa app relies on Intel’s Smart Sound technology, however, we’re not certain the 27-inch model actually supports the app.

What we do know about the 27-inch version is that the two models presented on Monday, May 14, rely on Intel’s eighth-generation Core i7-8700T six-core processor, with a base speed of 2.4GHz and a maximum speed of 4.0GHz. It’s joined by Nvidia’s discrete GeForce GTX 1050 graphics chip with 4GB of dedicated graphics memory. That doesn’t make HP’s AIO a smoking-hot gaming machine, but the GTX 1050 is definitely better than relying on integrated graphics.

Both machines also rely on the same amount of DDR4 memory: 16GB clocked at 2,400MHz. They have dual-storage configurations, too, pairing a PCIe NVMe M.2 SSD as the operating system drive with a mechanical hard drive spinning at 5,400RPM for storing data. Specifically, both have a 256GB SSD, while the B210 sports a 1TB hard drive and the B214 provides a 2TB hard drive.

On the connectivity front, both ship with five USB-A ports (5Gbps), one Thunderbolt 3 port (~40Gbps), one HDMI input port, one HDMI output port, a headphone / microphone combo jack, an SD card reader, and an Ethernet port. Other shared features between the two consist of Bluetooth 4.2 and Wireless AC at 867Mbps (2×2) connectivity, an HP TrueVision FHD IR camera, and a sound bar packed with front-facing speakers. 

Outside of storage, the only difference between the B210 and the B214 is their display resolution. Both rely on an IPS panel for rich colors and wide viewing angles, but the B210 sports a 3,840 x 2,160 resolution with 10-point touch, while the B214 packs a lower 2,560 x 1,440 resolution, also with 10-point touch. With the B210, you are effectively trading half the hard drive capacity for the higher resolution.

Meanwhile, HP has yet to launch its refreshed Pavilion Wave, a speaker-like triangular desktop PC sporting the new Amazon Alexa app for Windows 10.

“It now will offer a custom LED to indicate Alexa is listening,” the company said in January. “Working with Intel and Amazon engineers to optimize Intel Smart Sound Technology to help deliver a hands-free experience, the 360-degree multi-directional reflective audio can hear voice commands from any angle.” 

HP introduced a huge lineup of devices on Monday, May 14, including the Elite 1000 Series laptops and desktops, as well as an updated Envy portfolio.

Editors’ Recommendations

  • Lenovo takes wraps off Windows 10, Chromebook device lineup at MWC
  • Origin PC’s latest notebook packs Intel’s Core i7-8750H CPU, Nvidia Max-Q GPU
  • HP’s mainstream Pavilion PCs refreshed with latest AMD Ryzen, Intel Core CPUs
  • You can stuff a hefty Core i9 six-core CPU in Dell’s new refreshed XPS 15 laptop
  • Acer’s stylish new all-in-one PC packs a built-in Qi wireless charging station


15 May

Another Facebook privacy scandal — 3 million users’ data exposed by quiz


Justin Sullivan/Getty Images

Facebook is once again at the center of a scandal over data mining on its platform, after it was discovered that another personality quiz hosted on the social network harvested the personal information of some three million people. The data was only supposed to be accessible through an approved research platform, but has since been discovered on a website with little to no protection.

Facebook has been embroiled in a number of data protection and privacy scandals in recent months, most notably surrounding the alleged election-interfering firm Cambridge Analytica. Much of the concern over that company’s involvement with Facebook was in how it harvested the data of users through quizzes on the site under the guise of academic research. It seems now that it wasn’t the only one.

In this latest case, the “myPersonality” quiz collected various pieces of information about users who took a psychology quiz on Facebook. About half of those who took part allegedly gave their permission for the data to be shared with third-party researchers. New Scientist reports that some 280 people at different technology companies were given access and that, somewhere along the way, the data ended up on a poorly secured website. It was password protected, but those login credentials could reportedly be found with a simple web search.

Although not as expansive as the data exposed in the Cambridge Analytica scandal, this latest leak still contained results of the personality quiz, as well as personal Facebook details and even status updates of 150,000 users. The strongest link to the earlier data harvesting, though, is that the University of Cambridge’s Psychometrics Centre controlled both data sets. Alexandr Kogan, a central figure in the Cambridge Analytica scandal, was part of this project as well.

Another striking element of this story is how far back it goes. The Verge highlights that the data collection project may have begun in 2009, and that there was some discussion of Cambridge Analytica acquiring the data, though the deal was apparently turned down because of the firm’s involvement in politics.

Facebook’s response to the story has been a promised investigation into the myPersonality quiz and associated apps, and it has so far suspended 200 apps that could be harvesting data in this manner. New Scientist, however, highlights that Facebook has been aware of the quiz since as far back as 2011.

Want to improve your privacy on Facebook? Here are our top tips.

Editors’ Recommendations

  • Facebook was always too busy selling ads to care about your personal data
  • Another Facebook quiz could have stolen data under the guise of research
  • 9 things to know about Facebook privacy and Cambridge Analytica
  • Cambridge Analytica’s ex-director wants to fix data privacy. Can we trust her?
  • Localblox data breach is the latest nightmare for Facebook, LinkedIn


15 May

New Adobe XD Starter Plan opens user experience design to all


Interested in checking out Adobe XD but don’t want to drop the cash for it? You’re in luck. On Tuesday, May 15, Adobe launched the XD CC Starter Plan, a completely free version of the user experience design app made both for those new to UX design and for collaborators on existing projects. The Starter Plan comes with a handful of updates to XD, including enhanced compatibility with Photoshop.

With free access to XD, Adobe says that more team members will be able to access files without an XD subscription, while students and users new to UX design get an early entry point. The Starter Plan includes access to the top design, prototype, and share features for both Mac and Windows, as well as mobile and web. However, it limits the number of prototypes and design specs that can be published for collaboration at one time and includes just 2GB of storage. Other assets are also limited, such as the font sets.

Khoi Vinh, a principal designer for Adobe XD, said that the XD CC Starter Plan lets users without “designer” in their title collaborate on team projects in richer ways, allowing marketers, strategists, and others on the team to access more than just the project preview shared online. The XD CC Starter Plan is also designed for those who are just toying with the idea of creating an app, Vinh said.

Adobe XD has only been out of beta for about six months, after Adobe spent nearly two years working with the UX design community to create the app. It includes workflow tools designed to streamline the creation process, including a design specs panel that links fonts, colors, and symbols so that a single change applies to multiple elements at once.
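
That linking behavior boils down to shared references: many elements point at one style definition, so editing the definition updates every element at once. Here is a minimal conceptual sketch in Python; the SharedStyle and Element names are invented for illustration and are not Adobe’s actual object model.

```python
# Conceptual sketch of linked styles: many elements reference one shared
# definition, so one edit propagates everywhere. Not Adobe's real model.
from dataclasses import dataclass

@dataclass
class SharedStyle:
    font: str
    color: str

@dataclass
class Element:
    name: str
    style: SharedStyle  # a reference to the shared definition, not a copy

brand = SharedStyle(font="Helvetica", color="#FF0000")
header = Element("header", brand)
button = Element("button", brand)

brand.color = "#0055FF"  # edit the linked style once...
print(header.style.color, button.style.color)  # ...and both elements update
```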

The full Adobe XD is also getting a handful of updates beginning today. The previously announced compatibility with Sketch and Photoshop CC expands with access to stroke and image effects without leaving XD. The update also adds the option to paste to multiple artboards.

Design specs can now be password protected before they are shared with other team members. Drag-and-drop symbols will also replace repeating symbols in one move, rather than requiring each symbol to be replaced individually. The update also includes a number of free UI kits, or asset packs that help designers get started, like a pack with tools for designing a smartwatch app, another focused on travel, and another on cryptocurrency.

During the demonstration of the latest update, Adobe also teased a handful of features to come later, like timed interactions for tasks such as adding on-screen instructions after a period of inactivity, as well as automatic animations. Overlay support, a feature commonly used for tasks such as laying a menu over an existing screen, was also among the sneak peeks.

“UX design is one of the fastest growing segments in design,” Scott Belsky, chief product officer and executive vice president for Creative Cloud at Adobe, said in a statement. “The new Starter plan for Adobe XD supports Adobe’s vision to give everyone — from emerging artists to enterprise brands — everything they need to design and deliver exceptional digital experiences and explore the rapidly expanding field of UX design with no financial commitment.”

The Starter Plan for XD is only the beginning, however. Adobe also announced the Adobe Fund for Design, a $10 million fund that will provide grants and equity investments for designers and developers. The fund is designed to support innovators in the experience design field, as well as work on integrations and plug-ins that bring third-party tools to Adobe XD.

Editors’ Recommendations

  • Adobe enables faster workflow with updates to XD, InDesign, and Illustrator
  • Have a design that bears repeating? CorelDraw’s new symmetry tool makes it easy
  • Vimeo’s new Final Cut Pro app cuts the distractions, expands export tools
  • How Porsche Design’s unwavering passion shaped the Mate RS phone
  • Need Photoshop? Here’s how to get it without overpaying


15 May

With ChroMorphous technology, clothes may soon change color with a tap of a phone


With a simple tap on your smartphone, you can control a lot around you — whether it’s the music playing from your Bluetooth speaker, the Netflix show on your TV, or even your AC unit. But what if you could also use your phone to alter how your clothes look? That’s where ChroMorphous — an active, user-controlled color-changing fabric technology — comes in.

Developed by a team of research scientists at the University of Central Florida’s College of Optics and Photonics (known as CREOL), the new technology allows users to control and switch up the pattern on a fabric using a mobile app.

Behind the tech

CREOL claims the technology differs from the “color-changing” fabrics currently on the market. Rather than using LED tricks to emit light, the fabric physically changes color because of the fiber itself. Produced in Melbourne, Florida, the fibers are created from raw materials using machines that spin fibers for textiles. The fibers are then shipped to mills, woven on regular industrial-scale weaving machines, and the finished fabric is shipped back to the CREOL team. While the fabric is produced with traditional machinery, the process is modified to run a micro-wire inside each thread, so that running a current through the wire raises the local temperature slightly — enough for a thermochromic pigment to switch color. Basically, the pigments embedded in the thread respond to the change in temperature and change color.
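
To put rough numbers on that mechanism, here is a back-of-the-envelope sketch in Python of how resistive (Joule) heating in an embedded micro-wire could push a thread past a thermochromic pigment’s activation temperature. Every value here (wire resistance, thermal resistance, pigment threshold) is an illustrative assumption, not a figure from CREOL.

```python
# Back-of-the-envelope model of Joule heating in a thermochromic thread.
# Every parameter value below is an illustrative assumption, not CREOL data.

def steady_state_temp_rise(current_a, resistance_ohm_per_m, length_m,
                           thermal_resistance_k_per_w):
    """Steady-state temperature rise (K) of a heated micro-wire.

    Joule heating: P = I^2 * R; lumped thermal model: dT = P * R_thermal.
    """
    power_w = current_a ** 2 * resistance_ohm_per_m * length_m
    return power_w * thermal_resistance_k_per_w

AMBIENT_C = 25.0             # room temperature
PIGMENT_THRESHOLD_C = 33.0   # assumed pigment activation temperature

# Hypothetical wire: 50 ohm/m, 0.5 m per stripe, 400 K/W to ambient.
for current_ma in (10, 20, 30):
    rise = steady_state_temp_rise(current_ma / 1000, 50.0, 0.5, 400.0)
    temp = AMBIENT_C + rise
    state = "switched" if temp >= PIGMENT_THRESHOLD_C else "original"
    print(f"{current_ma} mA -> {temp:.1f} C ({state} color)")
```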

“When you think of anything you work with on a daily basis, whether it’s how you communicate with people [or] buy stuff, it’s infused with technology … The goal here is to now take fabrics and textiles and garments and say let’s infuse that with technology,” Dr. Ayman Abouraddy, professor of optics and photonics at UCF, told Digital Trends. “I want to be able to communicate with it, I want to tell it how to look, and nothing like that has happened yet. Color-changing is one thing, but our vision is to do much more than that as well.”

At first glance, the fabric doesn’t look any different from what you’d see on a regular canvas tote bag or even on shoes. Running our hand along the sample, we couldn’t tell that there was any special technology embedded in it other than the battery attached to the side. But we did notice it felt rougher than fabrics such as cotton or polyester, which can play a huge role in whether or not a consumer chooses to buy an item they’ll be wearing. The team explained that this is due to the larger thread, which makes the fabric feel coarse. Currently, CREOL is working toward producing thinner fibers so the material feels smoother and more flexible. The hope is that consumers eventually won’t be able to tell the difference between the technological fabric and regular fabric currently on the market.

“Even Google tried this, and they went the usual route of weaving a metal wire within clothing. Here, the reason it looks so natural is [because] the wire is inside the fiber, so you never touch it, you never come in contact with it,” Abouraddy explained. While Google has yet to find a way to incorporate color-changing technology into clothing, the company did recently partner with Levi’s to create a wearable electronic jacket. The jacket is currently available for purchase, and users can control things like music playback and navigation with simple gestures while wearing it.

With a collective background in optical fiber manufacturing, the team at CREOL admitted the underlying technology isn’t new, but said it hadn’t previously made it past scientific publications. While CREOL claims ChroMorphous is the first of its kind, fashion designer Julianna Bass incorporated a similar concept into her clothing line in September. Though it isn’t controlled through an app, that technology uses thermochromic inks that let consumers trigger a color change via a soft button. It has yet to hit the market.

Sitting down with ChroMorphous

After setting up the app on our iPhone, we were able to watch the fabric switch from a solid color to a pattern by simply tapping a button. While the interface is still in its infancy, the ChroMorphous team is working to develop an app that gives the user full control over the patterns they can choose. As of right now, the fabric connects to your phone via Wi-Fi, but there are plans to add Bluetooth in the future. Once the technology is incorporated into designers’ products, the researchers hope consumers will use it the same way they would a Fitbit or Apple Watch — by purchasing the product, downloading the app, and taking advantage of it.

“We hope that in the future people will assume clothes can do more than just protect [you] from the environment. You assume it can do more – so you put on a garment and then look on your phone and see what is available for it to do,” Abouraddy said. “We are carrying phones anyway — what happened is they put extra stuff in the phone … a camera, so now you don’t have to carry a camera. … We wear clothes anyway, so we’re adding functionality to it.”

After opening the app, we had the option of a solid color, stripes, or a random pattern. We chose stripes and watched every other stripe on the fabric go from dark green to a lighter green. But each stripe is individually controllable depending on the programming — meaning the team could instead allow users to pick specific stripes to change color via the app. With the random option, the colors of the stripes change over time.
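
As a thought experiment, the per-stripe addressing described above might look something like the following on the app side. This is a hypothetical sketch: the FabricController class, its methods, and the pattern names are all invented for illustration and are not CREOL’s actual software.

```python
# Hypothetical sketch of per-stripe control, as the article describes it.
# None of these names come from the real ChroMorphous app.
import random

class FabricController:
    """Models a fabric whose stripes can be switched individually."""

    def __init__(self, num_stripes: int):
        # False = base color (dark green), True = switched (light green)
        self.stripes = [False] * num_stripes

    def set_stripe(self, index: int, switched: bool) -> None:
        """Heat (or stop heating) the micro-wire in one stripe."""
        self.stripes[index] = switched

    def apply_solid(self, switched: bool) -> None:
        for i in range(len(self.stripes)):
            self.set_stripe(i, switched)

    def apply_stripes(self) -> None:
        # Alternate stripes, matching the demo described above.
        for i in range(len(self.stripes)):
            self.set_stripe(i, i % 2 == 0)

    def apply_random(self) -> None:
        for i in range(len(self.stripes)):
            self.set_stripe(i, random.random() < 0.5)

fabric = FabricController(num_stripes=12)
fabric.apply_stripes()
print(fabric.stripes)  # every other stripe switched to the light color
```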

Because the piece of fabric in the demonstration was larger than what you’d see on a handbag or backpack, it took about 45 seconds to fully switch over to the new pattern. But the switch can be quicker or slower depending on how much juice you have in your battery: if you need the pattern or color to change quickly, you’ll draw more energy from the battery than you would by letting it change at a slower pace.

When watching the tote bag switch from a solid color to stripes, our initial thought was the amount of money we’d save by no longer purchasing more than one of an item we love in multiple colors. While changing the design of the fabric or color doesn’t completely change the look of the bag, it does provide consumers the opportunity to get even more use out of it than they normally would — knowing they’ll be able to incorporate it into more outfits by customizing it. It can also help during those rare, but realistic, occurrences where you’re wearing the same outfit or item as someone else in the room.

The fabric is also washable — as long as you remove the detachable battery and connector, you’ll be good to go. But CREOL does recommend hanging it to dry rather than using the dryer. If there are any wrinkles, you can iron it with a steam iron as well.

Wearing ChroMorphous technology

We were excited about the prospect of switching up our wardrobe without having to buy more clothing, but it did leave us with one question — where do you put the battery while wearing the fabric? For a tote bag or backpack, the battery sits in an inner pouch or pocket. The team wasn’t so sure when it came to a clothing item such as a dress, suggesting the battery might sit on a bra strap but noting that comfort tests still need to be done. Then there’s also the possibility of wanting an even bigger battery to power a large item quickly. Questions such as these will most likely keep ChroMorphous technology from hitting the consumer market until next year, as the team works to weed out consumer pain points.

“Somebody had to take that step and bridge that gap between these scientific demos and an actual large scale … it’s now a real technology, not a scientific achievement. And our goal … is to have the designers — the actual producers of textile-based products for the market — be aware of it, and [think] of incorporating it into their product,” Abouraddy said.

While the future of ChroMorphous technology is still unclear, CREOL knows one thing for certain — it’s officially ready for designers to start using it to produce clothing, accessories, and more. The technology is currently scalable at mass-production levels, and it doesn’t stop at clothing and accessories — it can also be used in interior design, such as furniture and fixed installations.

Editors’ Recommendations

  • Research Scientists Debut ChroMorphous, A Color-Changing Technology
  • I wore Levi’s smart jacket for three months, and it changed how I use my phone
  • Charge your devices simply by plugging them into your Radius backpack
  • The best smart luggage
  • Awesome Tech You Can’t Buy Yet: Glamping hammocks, plasma lighters, and more


15 May

When you can’t take the studio with you, Alfred AI fixes bad lighting on phone photos


While smartphone cameras are pocketable, professional lighting equipment is not — so one company is looking to artificial intelligence to mimic photographic lighting with just an app. Relonch Alfred is an iOS app that uses machine learning to fix badly lit photos, adjusting each area of the image separately to correct dark, imbalanced snapshots. By creating exposure and color maps, Alfred can correct the lighting using a single slider instead of multiple controls and localized edits, all without requiring a dual-lens camera. The app, currently in a demo stage, was officially announced May 15.

Alfred comes from Relonch, the company that created the A.I.-powered camera that edits photos for you. With Alfred, the company’s A.I. editing moves beyond the Relonch camera to smartphones.

Instead of the depth maps created by dual-lens cameras, Alfred creates a color and exposure map of the photo, noting where the lightest and darkest areas of the image are located. That data allows the software to apply localized edits with a single slider adjustment, brightening up the darkest portions of the image while leaving the properly exposed areas untouched.
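
Relonch hasn’t published Alfred’s algorithm, but the core idea of one slider driving spatially varying edits can be approximated in a few lines: derive a per-pixel darkness map from luminance, then apply gain in proportion to it. The following NumPy sketch is an assumed illustration of that principle, not the app’s actual code.

```python
# Rough sketch of exposure-map-driven local brightening (illustrative only;
# Relonch has not published Alfred's actual algorithm).
import numpy as np

def brighten_shadows(image: np.ndarray, slider: float) -> np.ndarray:
    """Lift dark regions of an RGB image, leaving bright regions alone.

    image:  float array in [0, 1], shape (H, W, 3)
    slider: single user control in [0, 1]; 0 = no change
    """
    # Luminance stands in for the per-pixel "exposure map."
    luminance = (0.2126 * image[..., 0] +
                 0.7152 * image[..., 1] +
                 0.0722 * image[..., 2])
    # Darkness mask: 1 where the photo is underexposed, 0 where it is fine.
    darkness = np.clip(1.0 - luminance / 0.5, 0.0, 1.0)
    # Gain grows with darkness and the slider; well-exposed areas get gain ~1.
    gain = 1.0 + slider * darkness[..., None]
    return np.clip(image * gain, 0.0, 1.0)

# Example: a synthetic half-dark image, corrected with one slider move.
img = np.ones((4, 4, 3)) * 0.7
img[:, :2] = 0.1                 # left half is underexposed
fixed = brighten_shadows(img, slider=0.8)
print(fixed[0, 0], fixed[0, 3])  # dark pixels lifted, bright ones unchanged
```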

Alfred, named after photographer Alfred Stieglitz, is based on some of the same A.I. behind the Relonch camera. A 25-member editing team took nearly two years to manually edit 100,000 photos. Those manual adjustments were fed to the machine learning software to teach the program how to create localized adjustments (edits that are made only to a specific portion of the photo instead of the entire image). Once the program was created, the team also went back and re-edited photos in order to improve the A.I.
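
To make that training idea concrete, here is a toy, heavily simplified stand-in: given pairs of original and hand-edited pixels, fit a function predicting how much gain an editor applied at each brightness level. The synthetic data, the linear gain model, and the least-squares fit are all assumptions for demonstration; Relonch’s real pipeline is unpublished.

```python
# Toy sketch of learning localized adjustments from hand-edited pairs.
# This is a schematic stand-in for Relonch's (unpublished) training setup.
import numpy as np

rng = np.random.default_rng(0)

# Fake "dataset": original pixel values and hand-brightened targets.
originals = rng.uniform(0.0, 1.0, size=(1000,))
targets = np.clip(originals * (1.0 + 0.8 * (1.0 - originals)), 0.0, 1.0)

# Model: per-pixel gain predicted from brightness, gain(x) = a + b * x.
# Fit by least squares so dark pixels learn larger gains than bright ones.
observed_gain = targets / np.maximum(originals, 1e-6)
A = np.stack([np.ones_like(originals), originals], axis=1)
(a, b), *_ = np.linalg.lstsq(A, observed_gain, rcond=None)

print(f"learned gain(x) = {a:.2f} + {b:.2f} * x")
print("gain for a dark pixel (0.1): ", a + b * 0.1)
print("gain for a bright pixel (0.9):", a + b * 0.9)
```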

The exposure and color maps are key to how Alfred differs from other photo-editing apps. Co-founder and COO Yuriy Motin said that, unlike most other apps, Alfred looks at light, rather than applying filters, to edit the image. That’s an important difference, he says, because light creates the mood of the story within a photo.

Editing bad lighting is just the start for the app, which is still in the demo stage. While the light editing feature currently works off a cloud-based A.I., the team is working on moving the A.I. to the device so photos can be processed locally. The team is also working on a beta version of the software that works on video while it’s being recorded, bringing the same automatic exposure corrections used in still photos to video.

While the app doesn’t require any special kind of camera, Relonch is developing a version that will make use of depth maps on dual-lens smartphones. By mixing the exposure mapping with depth mapping, the software can recreate cinematic lighting effects, such as backlighting or stage lighting. Unlike the Portrait Mode on the iPhone X, Relonch’s A.I. allows the feature to work with any subject, not just people, and applies the lighting effect to the entire scene.

“The key part of our technology is the intelligence of our A.I. that adds drama into each story, filling it with an atmosphere the user originally intended, while keeping the image natural looking,” said Sergey Korzhenevich, Relonch co-founder and CEO. “Imagine a professional photo or video crew where the gaffer holds up light sources to remove shadows and harshness — our technology does all of that behind the scenes.”

Relonch Alfred is currently available on the App Store as a demo version with the ability to edit light on existing photographs. The team plans to add video compatibility over the next few weeks, while the ability to recreate light using depth maps is slated for a fall arrival. The team also plans to release an Android app in the future.

Alfred is currently free, but as the app expands, additional features — such as an option to edit multiple images at once — will roll out as in-app purchases.

Editors’ Recommendations

  • Vivo injects A.I. into its new Super HDR photo tech for more beautiful pictures
  • The Illuminati beams light data from multiple points to the photographer
  • Photo FOMO: VSCO gets advanced color edits, Samyang’s 50mm for 50 MP
  • There’s an entire photo studio in your phone with this AR app
  • Lytro is calling it quits, but says light field tech will live on


15 May

Oracle persuades Australia to examine Google’s data-tracking practices


Google and Oracle have long been engaged in a range of legal battles, and now, the latest iteration is playing out in Australia, where Oracle has successfully convinced competition and privacy regulators to look into how Google allegedly tracks its Android phone users. Oracle claims that Android phones send information to Google about where their owners are located, even if location services are switched off and there is no SIM card present.

These accusations first came to light in November, and while the source of the claim was initially anonymous, the Federal Trade Commission’s former chief technologist Ashkan Soltani suggested that Oracle was behind the story and had been attempting to plant this particular seed for months. Now, Oracle isn’t holding back at all. In a presentation to the Australian Competition and Consumer Commission (ACCC), the company claims not only that Android phones are sending location data to Google, but also that these devices are telling the internet giant what users search for and which websites they frequent.

“The ACCC met with Oracle and is considering information it has provided about Google services,” said the chairman of the ACCC, Rod Sims. “We are exploring how much consumers know about the use of location data and are working closely with the privacy commissioner.”

The Office of the Australian Information Commissioner added that it is “making inquiries with Google.”

Google, for its part, certainly isn’t backing down. “Google is completely focused on protecting our users’ data while making the products they love work better for them. Users can see what data is collected and how it’s used in one easy place, My Account, and control it all from there,” the company said. “Like many of Oracle’s corporate tactics, this presentation is sleight of hand, not facts, and given that Oracle markets itself as the world’s biggest data broker, they know it.”

“Any location data that is sent back to Google location servers is anonymized and is not tied or traceable to a specific user,” Google added.

This legal battle will doubtless play out over the next several weeks and months, and we’ll be sure to keep you updated as we learn more.

Editors’ Recommendations

  • Google loses copyright appeal against Oracle, may owe billions
  • Top family-friendly Google Play Store apps track children’s personal data
  • Google merges Android Pay and Google Wallet into Google Pay
  • North Carolina police force asks Google for data from devices near crime scenes
  • Everything you need to know about Google Assistant


15 May

Is Google Duplex an exciting or terrifying piece of technology?


Duplex is coming — are you ready?

Last week, during its I/O developer conference, Google unveiled Google Duplex – a technology that uses AI to call businesses on your behalf to schedule reservations, appointments, and more.

[Image: Google Assistant calling a business, from the Google I/O 2018 demo]

Google Duplex was this year’s biggest announcement, and while I’m beyond excited to see where it takes us, that optimism isn’t shared by everyone.

There’s a mixed bag of doubt, excitement, and worry in people’s reactions to Duplex. To get a small taste of that, here’s what some Android Central forum members had to say.

jhimmel
05-09-2018 09:43 PM

Going to be annoying for businesses receiving calls that are guaranteed to be a frustrating failure. My prediction.

Reply

DMP89145
05-09-2018 11:08 PM

Cool AF! 😎

I absolutely love the way Google is pushing tech. Being a little uncomfortable is a great sign that we are entering new territory and it’s time to move on to the next great chapter in technology.

Reply

Golfdriver97
05-10-2018 01:57 PM

Overall, I think it’s good for the advancement of AI. But it does seem rather creepy. Probably one of those things we will get used to. Similar to people talking to their phone but not to another person.

Reply

Wbutchart1
05-14-2018 03:32 AM

I dont understand why this is creepy. What harm does it do to talk to a machine on the other end and not realise? How could that upset, hurt or cause harm to another person? I dont understand this reaction in the media at all.

In the UK I’m forced to speak to machines all the time before I get to speak to a person, I’d love it if they sounded more human, intelligent, understood me and were far…

Reply

What’s your take on all this? Do you think Google Duplex is a cool or creepy use of tech?

Join the conversation in the forums!
