

27 Jan

Google adds chat-like commenting to Sheets and Slides on your phone


Over the last year or so, Google has focused on bringing the phone and tablet versions of its productivity suite up to par with the web versions. The latest feature rolling out as part of that mission is a revamped commenting experience in Docs, Sheets and Slides for Android and iOS. For starters, Google is adding mobile commenting to the latter two apps — it launched in Docs last year, but was absent from Sheets and Slides until today.

The actual experience of using comments on your mobile device has been revamped a bit as well. It’s a lot more like chat, something that makes sense for working with documents on a phone. When you’re collaborating and make a new comment, it’s easier to add people you’ve shared the document with (or people in your organization, if you’re using Google Apps at work). Just start typing a name and it’ll auto-complete with the email address of whoever you want to add to the comment thread. It’s nothing groundbreaking, but given how much competition Google is getting from Microsoft on the mobile documents front, anything it can do to improve the experience for users is a smart move.
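If you’re curious what that name auto-complete boils down to, here’s a minimal, purely illustrative sketch of prefix matching against a contact list. The contacts, the matching rule, and the function name are made up for demonstration and are not Google’s actual implementation.

```python
# Illustrative only: toy prefix-matching autocomplete like the behavior the
# article describes -- start typing a name, get back matching collaborators'
# email addresses. Contact data and matching rule are assumptions.
contacts = {
    "Alice Zhang": "alice@example.com",
    "Alan Turing": "alan@example.com",
    "Bob Smith": "bob@example.com",
}

def autocomplete(prefix: str) -> list[str]:
    """Return the email addresses of contacts whose names start with prefix."""
    prefix = prefix.lower()
    return [email for name, email in contacts.items()
            if name.lower().startswith(prefix)]

print(autocomplete("al"))   # ['alice@example.com', 'alan@example.com']
```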

Source: Google

27 Jan

The future of entertainment’s taking shape on a flying whale


When Alex McDowell tells me he’s considering using virtual reality as “a new kind of literacy,” as a way to educate using real science, it’s clear that I’m dealing with a visionary. We’re sitting beside The Leviathan Project, his “research project” that’s taking temporary residence at the Sundance Film Festival’s New Frontier exhibit, and dissecting the shifting parameters that define this brave new media world.

McDowell’s a film industry veteran who’s worked on production design with the bold-faced names who’ve directed some of cinema’s most unforgettable blockbusters. From David Fincher’s Fight Club to Terry Gilliam’s Fear and Loathing in Las Vegas and Steven Spielberg’s Minority Report, McDowell has had a hand in guiding our imagination and steering our conception of the future for several decades.

At the University of Southern California, where he is a member of the faculty, McDowell heads up the World Building Media Lab, an academic effort that unites experts and students alike from across several disciplines (e.g., game design, sound design, improvisational theater, engineering, cinematography) to pioneer the art of storytelling in what he calls the “post-cinematic” world.

The ‘hacked’ Oculus Rift headset used for The Leviathan Project.

It’s a heady term, for sure, but what McDowell is referencing is the new, immersive space that technologies like virtual reality and augmented reality have engendered for creatives. The combination of those two mediums has required a rethink of how narrative is constructed — “You gotta get more and more cross-disciplinary in your radar. And I have to learn the basis of a lot more languages to be able to design in this space,” he says — and of how audience engagement shifts from passive to active. McDowell is, quite simply, helping to redefine entertainment and, someday, by extension, education.

“There’s a handful of people that can go down to two or three thousand feet in a submersible, but that’s about it,” says McDowell, explaining the basis of a potential edutainment project his team’s considering. “In VR, we could put millions of people down in the depths of the oceans. And if you’re giving them real-world, science-based data that’s informing that world space, then they’re getting that real knowledge in a way that’s different than watching a documentary film.”

The undersea world McDowell plans to create doesn’t borrow from any of the fantasy elements that define the flying whale research lab of Leviathan. Instead, he says this potential project would incorporate real-world marine life and show the impact of pollution and the coral bleaching that has resulted from rising temperatures. The aim is to provide viewers with firsthand knowledge and experience of the deep seas previously reserved for multi-millionaires like James Cameron.

Sensors placed on the backs of hands and nearby objects allow for interaction between the physical and virtual worlds.

But first, McDowell and his team at USC have to finish iterating on Leviathan, currently in its second incarnation, which the team brought to Sundance after three months of breakneck work specifically to gain insight into audience engagement. And based on early feedback, he’s already discovered several areas where the team could refine the experience, like eliminating the artificial time limit for creating a hybrid squid-like flying creature known as Huxley, or reducing the excessive narration that guides you through it.

“It’s a failure of the state that we’re in that you need … that idea of a directed narrative. If the space is working, you shouldn’t need anything. And I think we’re a little bit in the stage of people don’t know enough about how to behave in these kind of [VR] spaces that we’re preemptively thinking that you need some sort of instruction,” says McDowell.

The Leviathan Project is an interactive installation adapted from the novels by Scott Westerfeld. In McDowell’s interpretation, you set up shop inside a lab situated in the belly of a flying whale that’s en route from London to Moscow in 1895, and tinker with genetics. The project has a twofold purpose: using a “hacked” Oculus Rift, viewers go on a task-based journey through VR that incorporates haptic interaction (i.e., you can pick up and manipulate physical objects with virtual consequences). Afterward, they can also view the creatures from the experience in the real world, using an AR-enabled tablet powered by Intel’s RealSense depth-sensing camera.

“In VR, we could put millions of people down in the depths of the oceans. And if you’re giving them real-world, science-based data, then they’re getting that real knowledge in a way that’s different than watching a documentary film.”

McDowell linked up with Intel around the time his World Building Media Lab was established in 2012. The company, which had secured the rights to Westerfeld’s novels, wanted to begin experimenting with emerging technologies.

“Intel was saying, ‘Taking on board the fact that cinema and theater and game and film and TV and all those things exist, and they’re not going away, how might we weave all of that together into a new semantic workflow platform, a creative process for the post-cinematic?’” he explains. “Whatever that may be. And VR and AR were the provocations.”

Intel’s RealSense-enabled tablet lets the Leviathan fly through the physical world.

Though several other literary properties, like the sci-fi works of China Miéville, were considered, McDowell and Intel settled on Leviathan because of the “self-contained ecosystem” the whale represented and the opportunities it afforded to the “evolution of narrative.” Eventually, they brought the project to life at the Consumer Electronics Show in 2014 as part of Intel’s CEO keynote.

“We flew an 80-foot whale off the screen and over 5,000 people,” he says of that early AR effort. “We kind of understood that you could get a real audience engagement out of this sort of experience. And then we developed that into apps and started thinking about how you could engage this little ‘Huxley’ engagement on the tablet.”

The app McDowell’s referring to is live on the Google Play Store and iTunes. It’s an offshoot of The Leviathan Project that lets users interact with a Huxley and take photos of it blended into the real-world environment. But before you rush to download it, be aware that the app requires a special password or scanned AR marker to activate, and so far those have only been made available to attendees of Sundance’s New Frontier exhibit.

A ‘Huxley’ comes to life in The Leviathan Project

Though The Leviathan Project, with its AR and VR components, reads like an attractive entertainment option for consumers, it’s actually not at all intended for commercial release. It’s all part of a grand push-and-pull experiment McDowell believes will help shape storytelling, as well as the technologies — and the companies designing those technologies — that are shaping our future.

“I think part of the transaction here with the technology companies is we are going to be able to give back empirical data to say ‘The audience prefers this kind of engagement’ or ‘We got great feedback from this,’” says McDowell. “And hopefully that shifts the way some of the technology evolves. That’s a big part of our job as artists: to say the emotional needs of this story are this, and therefore the technology has to kind of adapt. As much as the technology triggers all sorts of tools for us that we wouldn’t have otherwise, it’s a very symbiotic relationship now.”

27 Jan

Google’s AI is the first to defeat a Go champion


Google’s DeepMind division has pulled off an impressive milestone: its AI has beaten a top-ranked Go player five matches to zero. While computers winning chess matches against professional players has been old hat for a while, the computational power needed to master the Chinese game is astronomical. According to Google, there are more possible moves in a game of Go than there are atoms in the universe.

The company built a system called AlphaGo just to tackle the game’s nearly infinite possibilities. Instead of simply trying to evaluate all the possible combinations of a game, as it would with chess, the team fed the system’s neural network 30 million moves from professional players, then had it learn to create its own strategies by playing against itself, using a trial-and-error process called reinforcement learning.
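To make that two-stage recipe concrete, here is a heavily simplified, purely illustrative Python sketch: a softmax “policy” is first trained to imitate expert moves, then improved by self-play with a REINFORCE-style update. The toy game, reward rule, and every name here are stand-ins for demonstration, not DeepMind’s AlphaGo code.

```python
# A minimal sketch of AlphaGo-style two-stage training on a toy stand-in for Go:
# supervised learning on "expert" moves, then self-play reinforcement learning.
import numpy as np

rng = np.random.default_rng(0)
N_MOVES = 9                      # toy action space (think 3x3 board)
weights = np.zeros(N_MOVES)      # a "policy network" with one weight per move

def policy(w):
    """Softmax over moves: the probability of playing each move."""
    z = np.exp(w - w.max())
    return z / z.sum()

# --- Stage 1: supervised learning from expert games (30M moves for AlphaGo) ---
expert_moves = rng.integers(0, 3, size=5000)   # pretend experts favour moves 0-2
for move in expert_moves:
    p = policy(weights)
    grad = -p
    grad[move] += 1.0            # gradient of log-likelihood of the expert move
    weights += 0.05 * grad

# --- Stage 2: self-play reinforcement learning (trial and error) ---
def play_one_game(w):
    """Toy self-play game: sample a move, get +1 reward for 'winning' moves."""
    move = rng.choice(N_MOVES, p=policy(w))
    reward = 1.0 if move in (0, 1) else -1.0
    return move, reward

for _ in range(5000):
    move, reward = play_one_game(weights)
    p = policy(weights)
    grad = -p
    grad[move] += 1.0
    weights += 0.05 * reward * grad   # REINFORCE: reinforce moves that won

print("move probabilities after training:", np.round(policy(weights), 2))
```

After training, the toy policy concentrates its probability on the “winning” moves, which is the same feedback loop AlphaGo ran at vastly larger scale and with far deeper networks.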

All that training took up huge amounts of processing power and had to be offloaded to the Google Cloud Platform.

Google then invited reigning three-time European Go champion Fan Hui to its office to play against AlphaGo. The computer defeated him. Google was quick to point out that beating a human at Go is “just one rung on the ladder to solving artificial intelligence.”

AlphaGo is now slated to take on world champion Lee Sedol in March.

Source: Google

27 Jan

‘The Witness’ causes motion sickness in some players


Some first-person games — including Skyrim, Fallout 4 and The Talos Principle — trigger motion sickness and nausea in certain players, and The Witness is the latest addition to the list. The Witness is Braid creator Jonathan Blow’s latest game, a vast, introspective puzzler populated with pastels and intricate physical riddles. While the puzzles are designed to induce figurative headaches, some players have taken to the game’s Steam page, NeoGAF and Reddit to complain of queasiness and dizziness while playing.

“So, really interesting game, but I can’t play it for long or I feel ill,” Steam player theegravedigger explains. “I think it’s a motion sickness/FOV thing, but I haven’t had this experience since Wolfenstien.”

Blow is aware of the issue and he says in a tweet that he’ll implement a potential fix “really soon.”

Motion sickness is a tricky problem. On Reddit, one player said that both The Witness and The Talos Principle (a similarly puzzle-filled, first-person experience) induced nausea. Another player reported no issues with The Witness, even though The Talos Principle gave that person “so much motion sickness.”

The Talos Principle, which came out in 2014 and shares some gameplay aspects with The Witness, has since been updated with a “motion sickness” game mode. It appears to work well for many players, though there are still some nauseated outliers.

Via: Geek

Source: NeoGAF, Steam, Reddit

27 Jan

Aftershokz Trekz Titanium bone conducting headphones: review


Like many people, I commute to work on a bicycle. One of my biggest problems, besides dodging inattentive drivers, is being able to listen to music while still hearing the traffic around me. I’m pretty much stuck with one option and one option only: a Bluetooth speaker. Most speakers aren’t made to be mounted on a bike, and the ones that are take up valuable space on your handlebars.

While I was at CES this year, I bumped into the folks at Aftershokz who showed me their lineup of bone conducting headphones and were kind enough to provide me with a pair of Aftershokz Trekz Titanium to review. I’ve pretty much worn these headphones every day since.

Aftershokz Trekz Titanium bone conducting headphones overview

Aftershokz utilizes a nontraditional method of delivering sound: bone conduction. Instead of creating sound through small speakers and directing it into your ear canal, the Trekz Titanium send vibrations directly to your inner ear by vibrating the bones just in front of your ears. The benefit of this approach is that you can still hear what’s going on around you.

The Trekz Titanium feature a titanium band that is almost impossible to bend out of shape and does a great job of adding to the headphones’ portability and durability. The titanium band also helps minimize the Trekz Titanium’s footprint, as it is much smaller than the bands on Aftershokz’s other bone-conduction headphones.


The Trekz Titanium feature just three buttons: two volume buttons and a multipurpose button. The volume-up button doubles as the power button, which frees up the multipurpose button for other functions, including initiating Google Now voice commands. It can also change the song being played as well as answer and hang up phone calls.

On top of all this, they’re wireless, utilizing Bluetooth 4.0.

Aftershokz Trekz Titanium bone conducting headphones setup

The setup procedure is fairly easy. All you need to do to pair them with your phone is turn the headphones on by holding down the volume up/power button and keep it depressed until the headphones announce that they are in pairing mode. From there, you can search for them from your phone and pair them to your device.

Aftershokz Trekz Titanium bone conducting headphones usage

The Trekz Titanium headphones are very comfortable even when worn for long periods of time, and because they don’t block your ear canal, they’re easy to leave on all day. After a while, there was a slight discomfort from wearing them all day, but nowhere near the point of needing to take them off.



The battery life is excellent. They lasted me, on average, 2 days of normal use before I needed to charge them.

The sound quality was also pretty good. Because the headphones don’t go directly into the ear, the bass response is weaker than that of on-ear or in-ear options. Another downside is that these headphones aren’t well suited to listening to music at high volume, which detracts from being immersed in your music. Aftershokz seems to be aware of this and provides a pair of memory foam earplugs to help. They do help, and they’re the only way to listen to the headphones at full volume; without them, the vibrations cause the small hairs inside your ear to move back and forth, tickling the inside of your ear to the point where you won’t want to keep listening at that level.

This never happened at low to medium volume levels. In fact, low to medium volume levels are where these headphones excel. They allow you to be aware of what’s going on around you while still enjoying your music. I bike to work every day and being able to hear the traffic around me is extremely important so as to avoid any kind of accident.

One downside is that when you move your head from left to right it’s easy for the headphones to shift on your head. This is further complicated if you’re wearing a helmet or high-collared jacket. It’s easy enough to readjust, but it can be a little annoying when you’re having to do it several times during your commute.

Another area where I used the headphones was at work. I like to listen to music, but I also need to be able to hear my desk phone ring. It’s also nice to be able to carry on a conversation while music is playing on your headphones.

In reality, the headphones normally sound like ambient sound. They don’t sound like they’re beaming the music directly into your ears; it’s more like listening to music on a speaker somewhere in the room.

Another negative I found was that the microphone could be a little finicky at times. Leaning back in my chair, raising my hands above my head, or any other change in position could compromise the ability of the person on the other end of the call to hear me. This was frustrating at first, but once I figured it out, I just made sure to maintain proper posture during phone conversations.

4.16 out of 5 stars

Overall, these headphones are really great. They aren’t a replacement for your super awesome over-the-ear headphones, but they aren’t meant to be. Instead, they bring something else to the table: the ability to listen to music without deafening yourself to the world around you. That’s the main reason they’re now my go-to headphones. They sell for $129 on Aftershokz.com. What do you think? Are you going to give them a try?



27 Jan

Apple Seeds Second iOS 9.3 Beta to Public Beta Testers With Night Shift Control Center Toggle


Apple today released the second beta of an upcoming iOS 9.3 update for public beta testers, just a few days after seeding the second iOS 9.3 beta to developers. The second public beta comes a week after Apple released iOS 9.2.1, a minor update, to the public.

Beta testers who have signed up for Apple’s beta testing program will receive the second iOS 9.3 beta over the air after installing the proper certificate on their iOS device.

Those who want to be a part of Apple’s beta testing program can sign up to participate through the beta testing website, which gives users access to both iOS and OS X betas.


iOS 9.3 is a major update to the iOS 9 operating system, introducing a long list of new features and improvements. Its biggest new feature is Night Shift mode, which is designed to automatically cut down on the amount of blue light an iOS user is exposed to at night by shifting the iPhone or iPad’s display toward more yellow tones. iOS 9.3 also brings a number of changes for education users, and the iPhone is now able to pair with multiple Apple Watches.
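To give a rough sense of what “shifting toward more yellow tones” means in practice, here is a small, purely illustrative numpy sketch that warms an image by scaling down its blue (and slightly its green) channel. The function name and scale factors are assumptions for demonstration; Apple’s actual Night Shift works deeper in the display pipeline.

```python
# Illustrative only: a crude "night shift"-style warming filter.
# Reducing blue (and a little green) pushes every pixel toward yellow.
import numpy as np

def warm_image(rgb: np.ndarray, strength: float = 0.3) -> np.ndarray:
    """rgb: HxWx3 floats in [0, 1]; strength: 0 = off, 1 = maximum warmth."""
    out = rgb.copy()
    out[..., 2] *= (1.0 - 0.6 * strength)   # cut blue the most
    out[..., 1] *= (1.0 - 0.2 * strength)   # cut green a little
    return np.clip(out, 0.0, 1.0)

# Example: a single mid-gray pixel drifts toward yellow as strength increases.
pixel = np.array([[[0.5, 0.5, 0.5]]])
print(warm_image(pixel, strength=0.5))
```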

The update also includes new 3D Touch Quick Actions for stock apps like Weather, Settings, Compass, Health, App Store, and iTunes Store, plus it introduces password protection for individual notes in the Notes app. News in iOS 9.3 includes in-line video playback, landscape mode on the iPhone, and more personalization, while the Health app introduces a new Apple Watch-style “Activity” interface.

Today’s second beta includes support for a Night Shift toggle in the Control Center on iOS devices. Night Shift is denoted by a new eye-shaped icon in between the icons for the timer and the calculator on the iPhone. Tapping on the icon brings up options to turn the feature on or disable it until the next day.


A full list of changes in iOS 9.3 can be found in our “What’s New” post. iOS 9.3 will launch to the public this spring.



27 Jan

Apple Releases Second OS X 10.11.4 El Capitan Beta to Public Beta Testers


Apple today seeded the second beta of an upcoming OS X 10.11.4 update to public beta testers, just a few days after releasing the second OS X 10.11.4 beta to developers and more than a month after releasing OS X 10.11.2.

The second beta is available through the Software Update mechanism in the Mac App Store for those who are enrolled in Apple’s beta testing program. Those wishing to join the program can sign up on Apple’s beta testing website.

OS X 10.11.4 has been provided to developers and public beta testers alongside OS X 10.11.3, which is also currently in testing. Apple has provided testers with two betas of OS X 10.11.3, and it could be released to the public shortly.

Both OS X 10.11.3 and OS X 10.11.4 appear to focus largely on under-the-hood bug fixes, security enhancements, and performance optimizations with few noticeable outward-facing changes. OS X 10.11.4 does support password protected notes in the Notes app, allowing a password to be assigned to individual notes, and it includes Live Photos support for the Messages app.



27 Jan

Apple Releases Mac App Store Update for OS X Snow Leopard


Apple today released a minor update to OS X Snow Leopard with a refreshed version of the Mac App Store. According to Apple’s release notes, the Mac App Store has been updated to ensure the future compatibility of the app with the OS X Snow Leopard operating system.

Today’s Mac App Store update is available to all OS X Snow Leopard users and can be downloaded through the Mac App Store’s software update mechanism.

Before becoming available for public release, the OS X Snow Leopard Mac App Store compatibility update was made available to developers on January 20 for testing purposes.

OS X Snow Leopard (aka OS X 10.6) was first released in 2009. As software that has been discontinued and is run primarily on older machines, Snow Leopard updates are few and far between. The last significant Snow Leopard update, aside from security fixes, was introduced in 2011 ahead of the release of OS X Lion.

Apple’s Mac App Store first launched in 2011 as part of the OS X Snow Leopard 10.6.6 update.


27 Jan

Dual Camera iPhone 7 Plus Could Offer ‘DSLR-Like’ Quality, 3D Depth Mapping


Earlier today, reputable KGI Securities analyst Ming-Chi Kuo said the iPhone 7 Plus will likely have a dual-lens camera system based on technology Apple acquired from LinX Imaging. The new hardware could lead to some significant improvements in camera quality on Apple’s next flagship smartphone.

LinX’s multi-aperture cameras pack impressive image quality into a smaller footprint than single-aperture cameras, meaning the iPhone 7 Plus may lack a protruding camera lens while still taking SLR-quality photos — think Canon or Nikon. The camera modules are also capable of some very interesting technology, such as 3D depth mapping.

We previously provided an in-depth look at LinX’s camera modules after Apple acquired the company, but it is worth recapping some of the major advantages of their technology, given today’s iPhone 7 Plus rumor.

Noise Reduction

The images captured by the LinX camera are brighter and clearer, with significantly reduced noise levels, than those from typical smartphone cameras. The available detail when zooming into a photo was also much greater, as can be seen in the comparison below. View this PDF for more side-by-side image comparisons.

LinX noise-reduction comparison image.

Improved Indoor Photos

The photo below was taken in moderate light, at approximately 40 to 50 lux, similar to a decently well-lit room in a house or restaurant. The LinX sensor let in more light than the iPhone 5 or the Samsung Galaxy S4, for a photo that is clearer and sharper with less noise.

Indoor comparison at 40 lux lighting (LinX).

Improved Low Light Photos

LinX technology is able to significantly improve low light performance by using multiple channels to increase the sensitivity of the camera for better detail. It also keeps exposure times short to cut down on the motion blurring that can impact photo quality in conditions where lighting is not optimal.

Low-light comparison at 1 lux (LinX).

LinX technology doesn’t have to compromise between pixel size and resolution, as it can use small pixels but still let in adequate amounts of light.

3D Depth Mapping

LinX 3D point-mapping example.

LinX’s multi-aperture cameras can create detailed depth maps of objects. With depth information on a per-pixel basis along with RGB information, LinX cameras can create 3D point clouds of objects from a single frame or a complete 3D model by combining several frames captured from different angles.

3D depth mapping has a number of useful applications, including 3D scanning of objects, sizing of objects, background removal and replacement, and gesture recognition. The depth maps also allow for improved refocusing: by knowing the depth at every pixel, a synthetic blur can be applied that emulates a shallow depth of field.
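As a rough illustration of how per-pixel depth plus RGB becomes a colored 3D point cloud, here is a short sketch using the standard pinhole camera model. The intrinsics, image size, and function name are arbitrary stand-ins for demonstration, not LinX’s actual module parameters.

```python
# A minimal sketch of turning a per-pixel depth map plus an RGB image into a
# colored 3D point cloud via pinhole back-projection.
import numpy as np

def depth_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    """depth: HxW meters; rgb: HxWx3; fx/fy/cx/cy: camera intrinsics.
    Returns an Nx6 array of (x, y, z, r, g, b) points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx          # back-project pixel columns to metric x
    y = (v - cy) * z / fy          # back-project pixel rows to metric y
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    colors = rgb.reshape(-1, 3)
    valid = points[:, 2] > 0       # drop pixels with no depth reading
    return np.hstack([points[valid], colors[valid]])

# Tiny example: a flat 4x4 scene one meter from the camera.
depth = np.full((4, 4), 1.0)
rgb = np.zeros((4, 4, 3))
cloud = depth_to_point_cloud(depth, rgb, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape)   # (16, 6)
```

The same per-pixel depth values could instead drive a depth-dependent blur radius, which is the basic idea behind the synthetic refocusing mentioned above.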

Benefits Overview


– Better color accuracy and uniformity

– HDR – higher dynamic range

– UHDR – ultra high dynamic range

– Low noise levels

– Higher resolution


– Low module costs

– No autofocus needed for modules of up to 20MP

– Zero shutter lag

– Small size allows for slim devices and edge-to-edge displays

Read A Look at LinX Camera Technology for more details about the dual camera systems.



27 Jan

[TA Deals] This Complete Game Developer Course will teach you everything about making video games



Have you ever wanted to become a game developer? It takes a lot of time and money to follow that path through traditional avenues of education. Thankfully, you can achieve your dream through a self-study course called the Complete Game Developer Course, over on Talk Android Deals!

Here’s just a peek at what exciting things you’ll be learning as a game developer:

  • Learn practical, employable skills w/ over 423 lectures & 34 hours of content
  • Start making games from scratch w/ Construct 2
  • Create 60 real-life games to add to your portfolio
  • Make art & game design in Photoshop
  • Watch the courses at any speed you want
  • Build a variety of different types of games
  • Learn from a professional game developer who has released over 40 games

While this certainly won’t help you master the topic, it’ll get you on the right path to creating your own games. And if you work at it enough, you might even have a bright future working at a studio somewhere as a developer.

This particular course is only going to cost you $39. It normally retails for $300, but it’s been discounted by 87% for a limited time only. With that in mind, you’ll want to act fast!

Anyone plan on picking this sweet deal up?

[Talk Android Deals]
