
May 4

Facebook wants to make your virtual self appear as real as possible in VR



Facebook wants to transport your physical self into the virtual world by allowing you to create photorealistic avatars for virtual reality headsets. The technology was unveiled by Facebook chief technology officer Mike Schroepfer during the second day of the company’s developer-centric F8 conference in Silicon Valley, California.

The company has made progress over the years to evolve its avatar creation technology to make digital representations more lifelike and emotive. Facebook originally represented avatars as a simple blue face, but the technology eventually allowed users to personalize their virtual selves with more details and lifelike features. The result is still a cartoon-like representation of an individual, and it is not dissimilar to Snap’s personalized Bitmoji or Samsung’s AR emoji. Creating a photorealistic avatar is a logical next step for Facebook’s VR journey, as the company hopes that advancing technologies will help blur the borders between the real and virtual worlds.

Facebook didn’t detail much about its work with photorealistic avatars. We learned from demos that the company is using motion-capture technology to map facial images from photographs to capture various points on a user’s face. By carefully mapping a user’s face and facial characteristics, Facebook can synchronize facial movements and expressions in real life with the photorealistic avatar in VR.
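Facebook hasn't published how its mapping works, but the general idea of driving an avatar from tracked facial points can be sketched in a few lines. Everything below is a hypothetical illustration: the landmark names, the 25-percent threshold, and the single "mouth open" weight are all invented for the example.

```python
# Hypothetical sketch: turn tracked facial landmarks into an avatar
# expression weight. Facebook has not published its pipeline; this only
# illustrates the general idea of driving an avatar from face tracking.

def mouth_openness(landmarks):
    """Return a mouth-open weight in [0, 1] from tracked points.

    `landmarks` maps point names to (x, y) image coordinates; the gap
    between the lips is normalized by face height so the weight is
    independent of camera distance.
    """
    top = landmarks["upper_lip"]
    bottom = landmarks["lower_lip"]
    face_h = landmarks["chin"][1] - landmarks["brow"][1]
    gap = (bottom[1] - top[1]) / face_h
    return max(0.0, min(1.0, gap / 0.25))  # fully open at 25% of face height

# Closed mouth: lips touching.
closed = {"upper_lip": (0, 60), "lower_lip": (0, 60),
          "brow": (0, 20), "chin": (0, 120)}
# Speaking: a 25-pixel gap on a 100-pixel-tall face.
speaking = {"upper_lip": (0, 55), "lower_lip": (0, 80),
            "brow": (0, 20), "chin": (0, 120)}

print(mouth_openness(closed))    # 0.0
print(mouth_openness(speaking))  # 1.0
```

A real system tracks dozens of such weights (brows, eyelids, jaw, lip corners) and feeds them to the avatar every frame, which is what keeps the virtual mouth in sync with real speech.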

In late 2016, Facebook began experimenting with bringing more emotions into its VR avatar experience, according to Techristic. The company used body tracking to let users raise their fists in the air, make an angry expression, or shrug their shoulders. Photorealistic avatars with face tracking could allow Facebook to inject even more emotion into our digital personas in the virtual world. During his demo, Schroepfer showed that when an Oculus employee with a Rift headset spoke, his avatar’s mouth moved in unison with his speech.

Still, we don’t know when photorealistic avatars will arrive for consumers. New hardware may also be required, according to Upload VR, to allow Oculus to track eye movements, which could let our avatars blink when we do. Similarly, MindMaze has been working on a sensor mask, designed to sit between your face and a VR headset like the Oculus Rift, that tracks your facial movements and displays them in real time in the virtual world.

Photorealistic avatars could help to further define and transform the virtual and augmented reality experiences that we have today, allowing users from remote locations to experience games and movies together as if they’re in the same room. If you look over and see a photorealistic image of your friend enjoying the same experience as you in the VR space, it would make virtual experiences feel more convincing.

Editors’ Recommendations

  • More than 1,000 experiences are available for the Oculus Go VR headset
  • HTC Vive review
  • $200 Oculus Go VR headset hits Amazon
  • 8 Amazing accessories that could make virtual reality even more immersive
  • Qualcomm’s Snapdragon 845 VR reference headset puts body tracking in mobile VR


May 4

Oculus’ new prototype VR headset has something the HTC Vive Pro doesn’t


The Vive Pro certainly impressed us when we got our hands on HTC’s new top-tier VR headset, but there was one thing that a lot of people wished it had expanded upon: the field of view. Fortunately, Oculus seems to have read everyone’s minds and has been working away at that for some time now. At Facebook’s F8 event this week, Oculus showed off its new prototype, termed the “Half-Dome” headset, which takes the VR field of view from 110 degrees to 140.

Using the existing Oculus Rift and HTC Vive, the virtual world is displayed in front of your very eyes in gorgeous detail. But you don’t really want to look to the extreme left or right, as you’ll be staring at plastic and foam. The Vive Pro didn’t do anything to improve that. While HTC did make the virtual world more detailed with higher-resolution displays, Oculus may be the first of the two companies to develop a headset with an expanded field of view.

Where the Vive Pro comes with the same lenses as the original Vive, Oculus’ prototype adds new, larger lenses to the design. That’s what enables the wider field of view, which stretches into the wearer’s peripheral vision. In our experience, a wider field of view has a bigger effect on how immersive a VR world can feel; nothing’s worse than the goggle-like confines of a headset surrounding the user’s view. The Half-Dome’s field of view isn’t as wide as that of Pimax’s crazy 200-degree VR headset, but it’s a good start.

Better yet, those new lenses are also mechanically controlled varifocals. Think of them as a fancy version of your grandparents’ glasses: much like those lenses help them see near and far, Oculus’ new design would allow for various levels of focus throughout the visual plane. If you’re looking at an object up close, the lenses would refocus there, and similarly for objects in the distance.

Oculus suggested it would use software and hand-tracking to facilitate this, though it seems likely that some measure of eye-tracking would also be involved.

In theory, such technology could also enable performance-saving measures such as foveated rendering, which renders only the section of the screen a user is looking at in the highest detail, leaving peripheral vision to be rendered at a lesser standard. That may be why Oculus opted for mechanically manipulated lenses rather than software-driven field-of-view effects: where any software-driven effect would require additional GPU power, mechanically altered lenses have no such impact.
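The core of foveated rendering is simple to sketch: pick a detail level per screen region based on distance from the gaze point. The radii and levels below are made-up numbers for illustration; a shipping renderer would work in visual angle and feed the result to the GPU's variable-rate-shading hardware.

```python
import math

# Illustrative sketch of foveated rendering's core decision: choose a
# render detail level per pixel based on its distance from where the
# user's eyes are pointed. Thresholds and levels are invented.

def detail_level(pixel, gaze, full_radius=200, mid_radius=500):
    """Return 2 (full), 1 (half), or 0 (quarter) detail for a pixel."""
    dist = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if dist <= full_radius:
        return 2  # foveal region: render at full detail
    if dist <= mid_radius:
        return 1  # near periphery: half detail
    return 0      # far periphery: quarter detail

gaze = (960, 540)  # eye tracker says the user is looking at screen center
print(detail_level((1000, 560), gaze))  # 2 - near the gaze point
print(detail_level((1400, 540), gaze))  # 1 - mid-periphery
print(detail_level((0, 0), gaze))       # 0 - corner of the screen
```

This is also why eye tracking matters so much here: without knowing the gaze point, the renderer has no safe place to cut detail.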

Although the Half-Dome headset is very much a prototype and no real indication of what any future-generation Oculus headset will look like, it is a welcome sight from the company that kickstarted the modern VR revolution.

Now all we need is for HTC and Oculus to steal from each other so that we get a VR headset with a wider field of view and a higher-resolution display. And with a wireless module! It’s not too much to ask, is it?

Editors’ Recommendations

  • Facebook teases new Oculus prototype with mechanical ‘varifocal’ lenses
  • Leap Motion’s prototype augmented reality headset includes hand tracking
  • Baidu-owned ‘Netflix of China’ jumps into VR with a 4K headset with 8K support
  • Bose’s new prototype AR glasses focus on what you hear, not what you see
  • Google’s upcoming OLED display for VR headsets may pack a 3182p resolution


May 4

A new A.I. can guess your personality type based on your eye movements


“The eyes … they never lie,” said noted philosopher Tony Montana in the gangster movie Scarface. While Montana went down the drug-dealing and murdering route, had he been born 30 years later he could probably have had a promising career as a computer interface designer. At least, that’s the message we’re choosing to take away from a new project created by researchers in Australia and Germany. They developed an artificial intelligence that is able to predict a person’s personality type by looking into their eyes.

“Several previous works suggested that the way in which we move our eyes is modulated by who we are — by our personality,” Andreas Bulling, a professor from Germany’s Max Planck Institute for Informatics, told Digital Trends. “For example, studies reporting relationships between personality traits and eye movements suggest that people with similar traits tend to move their eyes in similar ways. Optimists, for example, spend less time inspecting negative emotional stimuli — [such as] skin cancer images — than pessimists. Individuals high in openness spend a longer time fixating and dwelling on locations when watching abstract animations.”

These insights are interesting, but the challenge for the researchers was figuring out a way to turn such observations into an artificial intelligence system. To do so, they turned to a deep learning A.I. to offer some help.

The researchers asked 42 students to wear an off-the-shelf head-mounted eye tracker as they ran errands. They also had the students’ personality types tested using established self-report questionnaires. With both the input (the eye data) and output (personality types) gathered, the A.I. was then able to work out the correlating factors linking the two.

“We found that we were able to reliably predict four of the big five personality traits — neuroticism, extraversion, agreeableness, conscientiousness — as well as perceptual curiosity only from eye movements,” Bulling continued.
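The overall shape of that pipeline — eye-movement features in, trait label out — can be shown with a toy classifier. To be clear, this is not the researchers' model: the study used deep learning on real tracker data, while the nearest-centroid rule, feature choices, and every number below are invented for illustration.

```python
import math

# Toy sketch of the study's pipeline shape: eye-movement features in,
# personality trait out. The real work used a deep-learning model on
# real tracker data; these centroids and features are invented.

def nearest_centroid(sample, centroids):
    """Classify a feature vector by its closest class centroid."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Invented centroids: (mean fixation duration in ms, saccades per second).
# Longer dwelling for high openness echoes the finding Bulling describes.
centroids = {
    "high_openness": (420.0, 2.1),
    "low_openness": (260.0, 3.4),
}

print(nearest_centroid((400.0, 2.0), centroids))  # high_openness
print(nearest_centroid((250.0, 3.5), centroids))  # low_openness
```

The hard part in the real study is everything this sketch skips: extracting robust features from noisy gaze recordings and validating that the learned correlations generalize beyond 42 students.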

While there are definitely potential ethical dilemmas involved (imagine what companies like the now-defunct Cambridge Analytica might have been able to do with this information), Bulling noted that there are plenty of positive applications, too.

“Robots and computers are currently socially ignorant and don’t adapt to the person’s non-verbal signals,” Bulling said. “When we talk, we see and react if the other person looks confused, angry, disinterested, distracted, and so on. Interactions with robots and computers will become more natural and efficacious if they were to adapt their interactions based on a person’s non-verbal signals.”

A paper describing the work was recently published in the journal Frontiers in Human Neuroscience.

Editors’ Recommendations

  • Baidu’s new A.I. can mimic your voice after listening to it for just one minute
  • Forget cloning dogs, A.I. is the real way to let your pooch live forever
  • How to delete your Facebook account
  • Machine learning? Neural networks? Here’s your guide to the many flavors of A.I.
  • Get your gaming on the go with the 25 best Android games




May 4

A Snipping tool looks to be the star of Windows 10’s next release


Even though Microsoft made its latest build of Windows 10 available for download earlier this week, the company is already hard at work on improvements for the next major update of Windows 10. The new Windows 10 Insider Preview build 17661, released on Thursday, May 3, reveals that Microsoft will make the Snipping tool a central feature of its next major Windows feature update. The preview is available to Insiders who opted for the Skip Ahead release and those in the Fast Ring.

“One of the loudest things we heard is that you want to be able to quickly snip and share a screenshot, and we’re making it happen! WIN + Shift + S will now bring up a snipping toolbar — snip a rectangle, something a bit more freeform, or full screen and it will go straight to your clipboard,” Windows Insider program lead Dona Sarkar revealed in a blog post detailing the new features. If you need more options, a notification will also pop up after you snip, letting you take the snip into the new standalone Screen Sketch app, where you can annotate your screenshot.

The new screen-snipping experience and the new Screen Sketch app are part of Microsoft’s efforts to optimize Windows 10 “for sharing and make communicating visually with others quick and easy.” By moving Screen Sketch out of the Windows Ink Workspace and into its own separate app, Microsoft hopes to improve the flow of capturing screenshots in a multitasking work environment.

“It will now show up in the list when you press Alt + tab, you can set the window size to be your preference if you like multitasking, and it even supports multiple windows (and tabs, thanks to Sets!),” Sarkar noted.

In addition to the new keyboard shortcut for the screen snipping experience, Microsoft will also give users three additional ways to capture their screen with this Insider build.

First, you can configure your Surface Pen through Windows’ pen settings so that the button at the end of the pen launches the screen-snipping experience directly. Second, even though it’s not enabled by default, you can change a keyboard setting to launch the screen-snipping experience with your Print Screen key. And third, you can use the quick action button inside Windows 10’s Action Center.

There are also other changes to Windows 10 that will be part of this build, including improvements to Focus assist while gaming, more support for the High Efficiency Image File Format, and security fixes. You can view the complete list of new features and improvements on Microsoft’s blog.

Editors’ Recommendations

  • Microsoft tests new privacy settings interface in latest Windows Insider build
  • Latest Windows 10 Insider build makes it easier to control your GPUs
  • New Windows 10 Skip Ahead build forces Mail app users to open links in Edge
  • Windows 10 ‘Lean’ shows up in a preview build for Windows Insiders
  • Code in latest Windows 10 preview hints at a Surface Phone


May 4

How to change your Twitter password and activate two-factor authentication


Keep those Tweets protected!

Twitter. Some people love it, some people hate it. No matter your take, you should ensure that your account is as safe as can be if you use the social network.


To keep your account as secure as can be, we recommend changing your password and enabling two-factor authentication if you don’t have it turned on already.

Without further ado, let’s get started.

Changing your password

From the Twitter app, tap on your profile photo near the top left.
Tap Settings and privacy.
On this menu, tap Account.
Tap the Password tab under Log in and security.

Enter your current password followed by a new password and then type in the new one again to confirm it.


After entering your info, you’ll see a small pop-up at the bottom of your screen letting you know your password has successfully been changed.

Make sure you create a strong password with a mix of lowercase and uppercase letters, numbers, and special characters. If you use a password manager like 1Password or LastPass, make sure you update your Twitter info there, too.
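That "mix of character classes" rule is easy to check mechanically. Here's a minimal sketch of such a check; real password meters go much further (length scoring, dictionary words, breach lists), and the 12-character minimum here is our own assumption, not Twitter's.

```python
import string

# Minimal sketch of the strength rule described above: require length
# plus a mix of lowercase, uppercase, digits, and special characters.
# The 12-character minimum is an assumption for this example.

def has_required_mix(password, min_length=12):
    checks = [
        len(password) >= min_length,
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
        any(c in string.punctuation for c in password),
    ]
    return all(checks)

print(has_required_mix("tweetbird"))          # False - too short, no mix
print(has_required_mix("C0rrect-Horse#42!"))  # True
```

A password manager sidesteps the whole question by generating and storing passwords that pass checks like this by construction.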

Enabling two-factor authentication

Once your password is changed, it’s a good idea to turn on two-factor authentication for your account. With this enabled, you’ll get a security code on your phone each time you log in to Twitter, and you’ll need to enter that code on the device you’re signing in from.

It’s a great way to add an extra layer of protection, and setting it up with Twitter is dead simple.

Tap your profile icon near the top left.
Tap Settings and privacy.
Tap Account -> Security -> Login verification -> Start.

Enter your password to verify your account and then tap the send code button to send a confirmation code to your phone.


After entering your code, two-factor authentication will be enabled on your account.

However, we suggest taking one extra step. By default, Twitter sends its authentication codes via SMS. This is fine, but it’s far less secure than using a dedicated security app. To change this:

From the Login verification page, tap Mobile security app and then the Start button.

Verify your password and select Set up now.


Twitter will then ask you which authentication app you want to use if you have multiple installed, but if you only have one, it’ll automatically take you to the one you use with a prompt to save your new key.
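The codes those authenticator apps produce follow the time-based one-time password standard (TOTP, RFC 6238): the app and the service share a secret key, and both derive the same short-lived code from it and the current time, with no SMS involved. A compact sketch, using only the Python standard library:

```python
import hashlib
import hmac
import struct
import time

# Sketch of TOTP (RFC 6238), the scheme authenticator apps implement:
# an HMAC of the current 30-second time window, truncated to 6 digits.

def totp(secret: bytes, timestamp: int, step: int = 30, digits: int = 6) -> str:
    counter = timestamp // step
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's published test vector: this secret at t=59 yields 287082.
secret = b"12345678901234567890"
print(totp(secret, 59))                # 287082
print(totp(secret, int(time.time())))  # the code valid right now
```

Because the code depends only on the shared secret and the clock, an attacker who hijacks your phone number gets nothing, which is exactly why an app beats SMS here.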

May 4

These three Philips LED SceneSwitch bulbs are just $14 right now


Light up the room.

This 3-pack of Philips LED A19 SceneSwitch Color Change Light Bulbs is $14.26 on Amazon. The pack normally sells for around $27 and has never dropped this low before; this is also the first major price drop of the new year for these particular bulbs. The price has fluctuated a bit, but anything around $14 to $15 is a good deal.

These are not smart bulbs, but if that’s what you’re looking for, we’ve got you covered: You can get these multi-colored Philips Hue smart bulbs for $40.99 each or these four white smart bulbs for $43.

While they may not have smart functionality, these bulbs can still change the atmosphere of a room with three settings that switch between daylight, soft white, and a warm glow. SceneSwitch remembers your favorite setting when the switch is turned off for more than six seconds. The bulbs are flicker-free, produce natural-looking light, and are Energy Star certified. If you don’t have a dimmer switch but still want some of that ability, these bulbs are perfect, and they come with a five-year warranty.
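The switching behavior works as a tiny state machine: a quick off-and-on at the wall advances to the next light setting, while a longer off period keeps the current one. Here's a sketch of that logic; the exact timing rule is our reading of the six-second description, not Philips' published firmware behavior.

```python
# Sketch of the SceneSwitch behavior described above: flipping the wall
# switch off and back on quickly advances to the next setting, while an
# off period over six seconds keeps the current one. The timing rule is
# an assumption based on the product description.

MODES = ["daylight", "soft white", "warm glow"]

class SceneSwitchBulb:
    def __init__(self):
        self.mode = 0  # index into MODES

    def power_cycle(self, off_seconds):
        """Simulate turning the switch off for `off_seconds`, then on."""
        if off_seconds < 6:
            self.mode = (self.mode + 1) % len(MODES)  # quick flip: next scene
        # off >= 6 s: the bulb remembers the last setting
        return MODES[self.mode]

bulb = SceneSwitchBulb()
print(bulb.power_cycle(1))   # soft white - quick flip advances the scene
print(bulb.power_cycle(1))   # warm glow
print(bulb.power_cycle(10))  # warm glow - long off keeps the setting
```

That's the trick that gives an ordinary on/off switch a taste of dimmer-style control without any smart-home hardware.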

See on Amazon

May 4

NPR and public radio group buy popular podcast app Pocket Casts


NPR, This American Life, WNYC Studios and WBEZ Chicago have teamed up to buy Pocket Casts, a cross-platform podcast app. The public radio outlets hope to improve the podcast discovery experience, help creators find new audiences and improve insights for producers.

Pocket Casts’ discover section currently highlights featured, trending and popular shows. The app also includes an Up Next function, which can automatically build a podcast queue for you.

Developer Shifty Jelly said in a blog post that while it had received previous offers from other suitors, it had turned them all down. It accepted the group’s offer because “everything from their not-for-profit mission focus, to their unwavering belief that open and collaborative wins over closed walled gardens resonated deeply with us.” The developer hammered home the point that nothing is immediately changing in the app; instead, it’s using its new resources to improve Pocket Casts.

Shifty Jelly’s team will continue working on the app, with former iHeartRadio and Clear Channel vice president Owen Grover joining as Pocket Casts CEO.

There’s no real indication that the NPR group will use Pocket Casts as their exclusive podcast home — they are in public radio, after all, and Pocket Casts is a paid app. NPR has its own app, NPR One, and still distributes its podcasts through other apps and platforms.

Shifty Jelly hinted at some future plans for Pocket Casts. It may be the case that some of NPR One’s features are rolled in, such as letting the app know when you like a story on its personalized stream and skipping the ones you’re less interested in. That might help Pocket Casts learn your preferences and suggest new shows in a more tailored fashion, perhaps through the Up Next queue.
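That like/skip feedback loop is the standard shape of a simple preference model: accumulate a score per show from listener signals, then rank candidates for the queue by score. The sketch below is purely hypothetical; neither NPR One's internals nor Pocket Casts' plans are public, and the weights and show names are invented.

```python
from collections import defaultdict

# Hypothetical sketch of the tailored-suggestion idea floated above:
# track likes and skips per show, then rank candidate episodes for the
# Up Next queue by the listener's running score for each show.

class PreferenceModel:
    def __init__(self):
        self.scores = defaultdict(float)

    def like(self, show):
        self.scores[show] += 1.0

    def skip(self, show):
        self.scores[show] -= 0.5  # skipping is a weaker signal than liking

    def rank(self, episodes):
        """Order (show, episode) pairs by preference, best first."""
        return sorted(episodes, key=lambda e: self.scores[e[0]], reverse=True)

model = PreferenceModel()
model.like("Planet Money")
model.like("Planet Money")
model.skip("Serial")

queue = model.rank([("Serial", "S01E01"), ("Planet Money", "Ep 840")])
print(queue[0])  # ('Planet Money', 'Ep 840')
```

Real recommenders layer collaborative filtering and content signals on top, but even this per-show tally is enough to reorder an Up Next queue toward shows a listener actually finishes.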

At the very least, the move will help the public radio group have more of a dedicated home amid the podcast app diaspora. The members could harness the immense popularity of their shows to direct listeners to Pocket Casts, though NPR has been reluctant to promote its own podcasts and app.

Via: The Verge

Source: NPR, Shifty Jelly

May 4

Anheuser-Busch will haul beer in Nikola hydrogen-electric trucks


The rivalry between Nikola and Tesla is only getting hotter… figuratively speaking. Mere months after ordering 40 Tesla Semis, Anheuser-Busch has ordered “up to” 800 of Nikola’s hydrogen-electric semi-trucks to introduce into its beer-carrying fleet starting in 2020. The deal should help Anheuser-Busch convert its entire long-haul roster to renewable-powered trucks by 2025 and will be equivalent to taking more than 13,000 cars off the road. As to why it’s not just relying on Tesla? One word: range.

The Nikola semis will travel between 500 and 1,200 miles on a full tank, versus Tesla’s 500-mile maximum. Nikola’s system will depend on relatively rare hydrogen stations, but the company is counting on having 700 such facilities in service by 2028. However underdeveloped hydrogen infrastructure is right now, it’s ultimately better suited to Anheuser-Busch’s long-distance logistics than existing electric-only options.

Suffice it to say this is a big deal for Nikola, which doesn’t have as many high-profile customers as its nemesis. The move could spur further orders from companies that see Anheuser-Busch’s purchase as a sign of confidence. And the fierce competition is good for everyone — it promises a future where emissions-free cargo transportation is the norm.

Source: Anheuser-Busch

May 4

Samsung Pay offers cash back on purchases from select retailers


Samsung announced a new Cash Back program for those making certain purchases through Samsung Pay. Within the Samsung Pay app’s home screen, there’s a new Cash Back section, and tapping it leads you to a list of offers from various retailers. Once you select an offer and make a purchase with that merchant, you’ll get a percentage of your purchase price back. That money can then be used toward another purchase, as long as it’s made through Samsung Pay. “Our mobile wallet strategy is all about offering more choices for consumers and more opportunities for merchants,” Samsung said in a statement. “Cash Back creates a new channel for merchants to reach and reward consumers who are looking for great deals.”
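The mechanic itself is just percentage math on qualifying purchases, accumulated as wallet credit. A minimal sketch, with the caveat that Samsung hasn't published its rates: the merchants and percentages below are invented.

```python
# Minimal sketch of the cash-back mechanic as described: a percentage of
# a qualifying purchase comes back as wallet credit, spendable only
# through the wallet. Merchant names and rates are invented examples.

OFFERS = {"CoffeeCo": 0.05, "ShoeMart": 0.10}  # hypothetical offer list

def cash_back(merchant, amount):
    """Return the credit earned on a purchase, 0 if no offer applies."""
    return round(amount * OFFERS.get(merchant, 0.0), 2)

balance = 0.0
balance += cash_back("ShoeMart", 80.00)  # 10% offer -> $8.00 credit
balance += cash_back("CoffeeCo", 4.00)   # 5% offer -> $0.20 credit
balance += cash_back("GasStop", 30.00)   # no offer -> $0.00
print(round(balance, 2))  # 8.2
```

The wallet-only restriction on spending that balance is what makes the program a retention channel rather than a plain rebate.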

Samsung launched support for PayPal in its mobile wallet earlier this month. In its announcement today, it also said users will soon be able to purchase Samsung Rewards points; the company says that option will become available later this month.

Source: Samsung