
Posts tagged ‘Google’

10
Nov

‘RunGunJumpGun’ is a damn near-perfect mobile game


Fast-paced, reaction-based, “twitch” games have always been my thing, but rarely have they ever been this simple. RunGunJumpGun blends the brutal level design of a twitch game, the accessibility of an automatic runner and one of the most intuitive control schemes ever conceived. I first played it in early September, just after it launched on Steam. Despite having just two inputs — shoot and jump — there’s an awful lot more to the game.

Ostensibly, it’s an automatic runner — think Canabalt or the upcoming Mario iOS game — but with a wealth of gameplay mechanics and ideas added on top. You’re always just running, gunning or jumping, of course, but through intelligent level design and a masterful difficulty curve, it stays fresh and taxing throughout its 120 levels.

Today, RunGunJumpGun is out for iOS and Android, and it’s perhaps the most challenging, rewarding and downright fun mobile game of the year. Before its release, I spoke with the team behind the title, ThirtyThree Games, to find out how they managed to get so much game out of just two buttons.

“We weren’t out to just make an infinite runner mobile game that’s run-of-the-mill,” said programmer Logan Gilmour. “We were hoping it would stand more among PC games than mobile games, but then play equally well on mobile.” ThirtyThree Games set out to emulate the rush of games like Super Meat Boy, VVVVVV and Hotline Miami, and they nailed it. The game has the same fast pace and “live, die, repeat” mentality, for sure. But its control scheme and structure make it a very different experience overall, and one that stands alone without the need for comparisons.

At its core, RunGunJumpGun is about balancing the two inputs. “Jumping,” in this game, is actually more like flying — your character aims their gun downwards and will ride upwards while you hold down the button. You also need to shoot enemies and obstacles in front of you, but as soon as you do, you start to lose altitude. Several times per second, you’ll be deciding which button to press, but your fingers never move except to press down. One button, one finger.

Removing all the other controls completely strips away everything between you and the game. “It lets you fall into a trance, and that’s kind of a big thing for the game, getting people lost in it,” said music and story designer Jordan Bloemen. “[Players] aren’t focusing on what they’re trying to do with a controller, they’re just trying to manage two buttons … Beyond that everything can just kind of wash over them.”

“It lets you fall into a trance, and that’s kind of a big thing for the game, getting people lost in it.”

Stripping away controls has its issues, though. You’re removing a lot of the tools that gamers are typically given to overcome the challenges placed in front of them. It’s easy for that to become annoying, but although you will certainly be frustrated by RunGunJumpGun at points — some levels had me dying maybe 30 times in a row — you’ll be frustrated at your lack of skill, not at the game itself.

That sense of fairness is key to twitch games. When one hit can kill, developers getting something wrong is difficult to stomach. Take Furi, a boss rush game released earlier this year. It’s generally superb, but there are several moments where it seems the game is unfair — maybe a parry timing is off, a hit box not quite right. As a result, I constantly put it down for weeks at a time in frustration. As mentioned, I struggled massively with some levels in RunGunJumpGun but I never once felt the urge to stop. I always knew it was my fault I was dying.


I struggled massively with some levels, but I never once felt the urge to stop. I always knew it was my fault I was dying.

A lot of work went into making RunGunJumpGun, its levels and each second of gameplay, feel fair yet challenging. There are small things, like making levels “concave,” so your character can’t get caught in a cove and die, or ensuring that the automated movement “is always the speed you want to go at,” but the truly interesting tweaks are invisible.

ThirtyThree Games used analytics and testers to analyze every second of gameplay. “We let a lot of people play the game, and we could see these big spikes where everyone was dying,” explained Bloemen. The team then acted on that data in different ways. Some levels were simply reordered for a smoother difficulty curve, but others were changed on a second-by-second level. “We collected the actual position where every person died,” said Gilmour. “So we could see where everyone was being killed by one hazard, and then just take the hazard out.”
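For the curious, the sort of aggregation Gilmour describes might look something like this — a minimal sketch in which the log format, bucket size and spike threshold are all my own invented stand-ins, not ThirtyThree Games’ actual analytics pipeline:

```python
from collections import Counter

def death_hotspots(deaths, bucket=50, threshold=5):
    """Bucket death x-positions per level and flag spikes.

    deaths: list of (level, x_position) tuples from playtest logs.
    Returns {level: [bucket_start, ...]} for buckets that collected
    at least `threshold` deaths.
    """
    counts = Counter((level, x // bucket * bucket) for level, x in deaths)
    spikes = {}
    for (level, start), n in counts.items():
        if n >= threshold:
            spikes.setdefault(level, []).append(start)
    return {lvl: sorted(xs) for lvl, xs in spikes.items()}

# Hypothetical playtest log: level 7 has a cluster of deaths around x = 300.
log = [(7, 300), (7, 310), (7, 305), (7, 340), (7, 312), (3, 90)]
print(death_hotspots(log))  # {7: [300]} — one hazard killing everyone
```

Real telemetry would also record which hazard dealt the killing blow, but even a crude histogram like this is enough to surface a spot where “everyone was being killed by one hazard.”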

The team would iteratively re-order and smooth out the levels, then bring in a new group of testers that hadn’t played before and see what the new data looked like. Their own little live, die, repeat loop, as it were. The importance of curving difficulty, according to the team, is paramount. “Especially the first world, that’s kind of make-or-break,” said Bloemen. “That’s where you’re going to piss someone off and they’re not going to play anymore.”

It’s tough to find a better example of a difficulty curve done right.

The first world (there are three, each containing 40 levels) hooked me; the second made me fall in love. It’s there that the developers start throwing a bunch of new elements at you, and it’s tough to find a better example of a difficulty curve done right.

Take the first ten minutes or so of world two: It starts by introducing a new mechanic — screen warping, which allows you to fly out the top of the screen and appear at the bottom, or vice versa. Then, it asks you to use screen warping to navigate a complex level. Then, it makes you do that with pinpoint accuracy — one false move and you’re dead. Finally, you’ve nailed it. Of course, before you have time to relax, turrets are added. Then force fields. Then spaceships that shoot at you. Then fire turrets — the barrage of new elements feels like it never ends.

Discounting the deaths, the section amounts to maybe three minutes of gameplay. In that time, you’ll have learned and mastered multiple new mechanics and hazards. Although the deaths come thick and fast, no single level transition is too challenging. But if you skipped any given minute, the leap in difficulty would be near insurmountable.

Later in the game, some of these new mechanics take a little longer to get used to. The addition of water in particular threw me off for a while, because the movement physics are completely different. The final few levels are also an exception, as the difficulty is pretty much just ratcheted up to 11. But the general curve, and the way new ideas are introduced, is nothing short of perfect.

While data obviously had a huge impact on defining the game’s structure, it wasn’t always enough. The team had their own thoughts on how enjoyable or challenging each level was, and the difficulty doesn’t rise in a straight line from beginning to end. “It’s important to have a little bumpiness in that curve,” said Gilmour. “Sometimes when we bring in a new mechanic, we make the first version of that a little harder. But you overcome it, and then the next time it’s easier, and you get a nice win, it feels like you’re getting some mastery.”

It also helps that playing the game feels great. The pixel art is bright and easy to follow, while the EDM soundtrack mixes menacing bass with light melodies that reminded me of another twitch favorite of mine, Electronic Super Joy. Like ESJ, rather than taking itself seriously, RunGunJumpGun is filled with humor. Some of this comes through dialog — there’s a story told through one-liners before each level — but a lot is down to the game itself. I lost track of the number of times I fell into an obvious trap, or a spinning disc bounced up at just the right moment to kill me — there’s a deviousness to the level design that, when coupled with the quick and colorful restart animation, actually makes dying as funny as it is frustrating.

There’s a final piece to RunGunJumpGun I’ve neglected to mention, and it’s perhaps the thing that’ll keep you coming back: Atomiks, the game’s name for the 10 tokens scattered throughout each level. Taking the “Atomik path” will bring you closer to death than any other path through a level, essentially making it “the nastiest way to play,” according to Gilmour. A tone chimes when you collect an Atomik, increasing in pitch each time to form a satisfying musical scale.

They’re almost like false waypoints, tempting you off the safer path at every opportunity. But collecting Atomiks is also the way you unlock more worlds, and “completing” the game is collecting them all. The path to victory is littered with near-endless death.

I played RunGunJumpGun a lot on PC, and grew deeply attached to it. As such, I was a little worried about how the game would handle on mobile. There’s a tactile immediacy about hammering away on a keyboard that’s just missing from a phone or a tablet. But my concerns were unfounded. The simplicity of the layout — tap the left side of the screen for jump, the right side for gun — means that you don’t miss the tactile feedback too much. I do think the game controls a little better with a keyboard, but being able to play it anywhere more than makes up for that.

“Personally, my favorite way to play it is on iPad,” said Gilmour. “It’s killer, the screen is really responsive, and you’re holding this thing, it reminds me of playing a Game Boy when I was a kid.” I have to agree. It’s great to jump into for a couple of minutes at a time, or to completely zone out with for an hour. I’ve handed the game to a few friends, and even those that don’t typically enjoy twitch games had a good time.

RunGunJumpGun is out now for iOS and will be released imminently for Android, priced at $2.99. There’s been some talk of a PlayStation Vita port in the future, but that’s very much in the “research to see if it makes sense” phase, according to Bloemen. Oh, and if playing on a PC or Mac is more your speed, the Steam version will temporarily drop to $2.99 as well. In case it wasn’t clear, whatever your platform of choice, I can’t recommend this game enough.

10
Nov

YouTube’s new VR app lands on Google Daydream first


We’ve known about Google’s plan for a dedicated YouTube VR app since Daydream was first announced back in May. Today, a version of the video hub optimized for virtual reality arrives. There’s a catch: It’s only available on Google’s Daydream platform right now, which means you’ll need a compatible phone, like the Pixel, plus the Daydream View headset and controller to start watching.

YouTube has supported VR videos for over a year, but now there’s an app to provide a more immersive viewing experience. The new software makes every video from the site viewable in a virtual reality setting. Of course, 360-degree footage and content that’s designed for VR will offer a more believable feel of “going inside,” but regular videos can be viewed in a new theater mode. Voice search is there to help you find exactly what you’re after, and you’ll be able to sign in so that the channels you subscribe to and your playlists are easily accessible.

Daydream View launches today, so the new YouTube VR app will be ready for action when your headset arrives. There’s also an NFL VR series that debuts Thanksgiving day on YouTube, so you’ll be able to watch that as well.

Source: YouTube Blog

10
Nov

Google’s Daydream View VR headset is promising, but just a start


It was only a matter of time until Google moved on from Cardboard and started taking virtual reality seriously. Say hello to the Daydream View, the company’s first mobile VR headset. Much like Samsung’s Gear VR, it’s powered by Android. But the big difference is that this $79 headset will work across a wide variety of Android phones that support Google’s Daydream platform; it won’t just be stuck on Samsung’s hardware. It also stands out from the competition with a more comfortable design made from cloth instead of plastic. There’s a lot riding on this headset and Daydream in general, but can Google really compete with VR companies that have been developing hardware for years? For the most part, it turns out it can.

Hardware

There are so many mobile VR headsets out there now that they’re starting to feel a bit boring. The Gear VR, built by Samsung in collaboration with Oculus, set the stage with its original design. It turns out when you’re doing mobile VR, you really just need a comfortable way to hold the phone to your face and some decent lenses to refocus the screen. The Daydream View changes up the formula a bit, though, with its soft cloth-covered case. That might seem a bit odd at first, but it makes a lot of sense. VR headsets are basically wearables, and fabric simply feels more comfortable than plastic. (We saw something similar with the PlayStation VR, which uses a soft cloth material around its eyepiece.)

Hooking up a phone to the Daydream View is also simpler than any other headset I’ve seen. You just need to open up the front latch, drop the phone in with the screen facing the lenses and then close up everything and secure it with an elastic band on the top. It might look a bit clunky, but it’s fairly secure. After that, tighten the headband, slip the Daydream View over your head, and readjust as necessary. Instead of using velcro straps, Google’s headset relies on a band that’s more like a messenger bag strap. It’s fairly comfortable, but adjusting it is a bit tougher than simply dealing with velcro.

Keeping the theme of comfort going, the eyepiece is also made out of a soft and cushiony material. Best of all, you can actually remove the eyepiece for hand washing. Which is a good thing: Based on my experiences with other headsets, you can bet it’s going to get sweaty and grimy quickly. Since it’s relying on fabric on top of a plastic frame, the Daydream View comes in at a feather-light 220 grams (0.48 pounds). The Gear VR, on the other hand, weighs 345 grams (0.76 pounds).

Another way Google aims to differentiate itself is with the Daydream View’s motion controller, a vastly simplified take on HTC’s and Oculus’s remotes that you use to navigate Daydream’s interface, play games and interact with apps. The motion tracking is generally pretty accurate, though I noticed some issues as its battery drained. The remote’s simple layout — touchpad on top, an app-specific button in the middle, a home button on the bottom and volume controls on the sides — also makes it easy to use while your eyes are covered. It charges over USB-C too, which is a nice touch since your Daydream device will likely charge that way as well.

Speaking of compatible phones, for now you can choose from Google’s Pixel or Pixel XL to power the Daydream View. Other manufacturers are currently working on their own entries, though there’s nothing you can actually buy yet. Google says Daydream phones will generally offer high-resolution displays (you can bet they’ll likely be AMOLED, since that works best for VR); “high-fidelity sensors” for head tracking; and “ultra smooth” graphics. You’ll also want to pay attention to resolution differences between Daydream devices. The Pixel XL has a 1,440p (2K) display, for example, while the smaller Pixel has a less impressive 1080p display. In general, the more pixels you can stuff into a screen the sharper your VR experience will be.

In use

I tested the Daydream View with the Pixel XL — likely supplied because Google wanted to show off its VR headset in the best possible light. Since the XL is such a large phone, though, it sticks out a bit when attached to the Daydream View. It still fit just fine, but the setup looks a bit unpolished. (Then again, the Gear VR looks even worse with a phone plugged in.) On the bright side, the Daydream View’s single elastic band did a fine job of holding the Pixel XL in place, even when I shook the headset like crazy.

Once you launch the Daydream app and slap the phone into the headset, you’re presented with a fairly typical home screen. It features recently used apps and your own shortcuts up front, and a button on the bottom of the screen leads to your entire library. At launch, Google has a handful of its own VR apps to explore: With YouTube VR, you can view normal videos on a flat or curved plane, or dive right into immersive 360-degree videos. Street View lets you take virtual strolls around famous locations. And Play Movies allows you to use the Daydream View like your own personal home theater. Third-party apps include the Wall Street Journal, Star Chart VR and games like Mekorama and Hunter’s Gate.

Quality-wise, VR experiences in Google’s headset look and feel just as good as the Gear VR. I had a blast sifting through 360-degree YouTube videos. And I’m pretty sure Mekorama could end up becoming a killer app for the platform. It tasks you with moving a robot around a small 3D space (similar to the hit mobile game Monument Valley), but being able to play it in virtual reality makes it truly addictive. The Wall Street Journal’s app places you in an expensive Midtown NYC apartment, where you can explore its VR content, watch videos, and, for some reason, read articles. (I’ll save words for my boring old non-VR screens, thank you very much.)

As with most VR headsets, games looked better than interactive videos, with sharp graphics and no noticeable slowdown in the apps I tested. Videos looked fine, but they’re still mostly held back by the lack of high-quality VR cameras on the market. And while I’m sure some people will enjoy watching traditional 2D videos in VR, that’s something I only find valuable when I’m stuck in a boring hotel room. The Pixel XL also warmed up quite a bit after my virtual reality sessions, so you should definitely keep battery life in mind. (In general, it burned through around 20 percent of battery life for every hour I played.)

The competition

If it isn’t abundantly clear by now, Daydream View is going squarely against the $100 Gear VR. I’m still a big fan of that headset, and if you’re a Samsung phone owner, it’s your only option. But moving forward, the mobile VR landscape is going to get more complicated. If you want the freedom to choose between different phones, rather than just Samsung’s, then you’re better off investing in the Daydream ecosystem.

Daydream’s big problem at the moment is its small selection of apps. There simply isn’t that much available on Google Play for the platform yet. Google says that’s going to change by year’s end, though, with the addition of Netflix, Hulu, and the New York Times, along with plenty of other apps. This is one area where the Gear VR has a big head start, since it’s been around for years and has a lot of content help from Oculus and Facebook. Still, Google is flexing its brand muscle a bit; it already has an exclusive VR experience for Fantastic Beasts and Where to Find Them, the upcoming film set in the Harry Potter universe.

Wrap-up

While the Daydream View doesn’t completely reinvent mobile VR, it’s a solid first step for Google. It’s ideal for testing the waters of virtual reality without being locked into Samsung’s ecosystem. But its success depends on more Daydream phones being released, consumers being willing to pay for a headset and developers jumping on the platform.

10
Nov

What the world searched on Google after the US elections


Based on the search strings that trended for November 9th, the world turned to Google in an effort to understand the President-elect’s surprise win and the United States’ complicated voting system. Google Trends posted on Twitter the most popular searches after Donald Trump was named the 45th President of the United States, and as Mashable noted, they reflect a lot of people’s confusion.

As you would expect, the candidates’ names are linked to election-related searches, such as “What will Hillary Clinton do now?” and “How did Donald Trump win?” But even generic search strings like “How did…” and “Why did…” were dominated by the presidential elections. “Why did Hillary concede?” trended, as well as the question that probably plagued a lot of people’s minds: “How did the polls get it so wrong?” Across the pond, people also looked up what Trump means for Brexit.

We embedded some of Google’s top search trends below, but you can check out Google’s World POTUS website for more search data. As for what results come up when you look up these questions, we’re afraid you’re going to have to Google them yourself.

“What does Trump mean for Brexit?” was among the top questions on #Brexit in the hours following @realDonaldTrump’s #USElection2016 win pic.twitter.com/wmb7JERu5M

— GoogleTrends (@GoogleTrends) November 9, 2016

“Who won the #popular vote?” Top questions on Google today#USElection2016 pic.twitter.com/l1nGjbldb8

— GoogleTrends (@GoogleTrends) November 9, 2016

Top searched “why did…?” questions today on Google#Elections2016 pic.twitter.com/95OjQHPieg

— GoogleTrends (@GoogleTrends) November 9, 2016

“How did Donald Trump win?” Top questions on Google outside the US today#USElection2016 pic.twitter.com/JgkYcfGBkZ

— GoogleTrends (@GoogleTrends) November 9, 2016

“What will @HillaryClinton do now?” Top questions on Google today#Elections2016 pic.twitter.com/THPyLc1y9I

— GoogleTrends (@GoogleTrends) November 9, 2016

How @realDonaldTrump won the election in search#dataviz #Election2016 https://t.co/1AAVYl1QPm pic.twitter.com/x1A26yRhaL

— GoogleTrends (@GoogleTrends) November 9, 2016

Via: Mashable

Source: Google Trends (Twitter)

10
Nov

Google machine learning can protect endangered sea cows


It’s one thing to track endangered animals on land, but it’s another to follow them when they’re in the water. How do you spot individual critters when all you have are large-scale aerial photos? Google might just help. Queensland University researchers have used Google’s TensorFlow machine learning to create a detector that automatically spots sea cows in ocean images. Instead of making people spend ages combing through tens of thousands of photos, the team just has to feed them through an image recognition system that knows to look for the cows’ telltale body shapes.

Like most current machine learning experiments, this isn’t completely accurate. An initial version could spot 80 percent of the sea cows that had been confirmed in existing photos. If performance improves enough, however, it would be much easier for scientists to both measure the size of endangered sea mammal populations and track their movement patterns. That, in turn, could lead to more targeted conservation efforts that could save a given species from extinction.
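The detection step the researchers describe — scanning a huge aerial photo for a handful of small shapes — boils down to sliding a window across the image and asking a trained model about each patch. Here’s a toy sketch of that idea; the threshold “classifier” below stands in for the team’s actual TensorFlow model, which obviously isn’t this simple:

```python
def sliding_window_detect(image, classifier, win=3, stride=1):
    """Scan an aerial image (a 2D grid of brightness values) with a
    win x win window, collecting the top-left corners of positive patches.

    `classifier` stands in for a trained model: it takes a patch and
    returns True if it looks like a sea cow.
    """
    h, w = len(image), len(image[0])
    hits = []
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            patch = [row[x:x + win] for row in image[y:y + win]]
            if classifier(patch):
                hits.append((y, x))
    return hits

# Toy stand-in: a "sea cow" is any bright blob (patch sum above a threshold).
def toy_classifier(patch):
    return sum(sum(row) for row in patch) >= 6

ocean = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
print(sliding_window_detect(ocean, toy_classifier))
```

A real pipeline would merge the overlapping hits around a single animal (non-maximum suppression) and use a convolutional network instead of a brightness threshold, but the scan-and-classify structure is the same.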

Source: Google

10
Nov

Google will retire Map Maker, the tool that let anyone update Maps


Google’s Map Maker tool was released in April 2011 as a parallel version of its official Maps app that encouraged user-submitted geographical and business changes, essentially crowdsourcing updates. Ideally, moderators would check the edits and roll them into the official versions once confirmed, though some trolling vandalism has squeaked through in the past. But it seems the dream of trusting everyone with public maps is dead. Yesterday, the search giant announced that it will retire Map Maker and fold it wholly into Maps by March 2017.

Starting today, edits made in Map Maker won’t be up for moderation, Google stated in a blog post. Instead, they’ll be folded directly into Maps. This begins a slow process of transferring most, but probably not all, of Map Maker’s tools over to Maps leading up to and after the transition date next spring. Which features won’t make the jump is unclear, but restricting edits and suggestions to its Local Guides program might prevent another vulgar joke edit from shutting down the public-facing site for weeks.

Via: SearchEngineLand

Source: Google Maps blog

9
Nov

Google’s mini radar can identify virtually any object


Google’s Project Soli radar technology is useful for much more than controlling your smartwatch with gestures. University of St. Andrews scientists have used the Soli developer kit to create RadarCat, a device that identifies many kinds of objects just by getting close enough. Thanks to machine learning, it can not only identify different materials, such as air or steel, but specific items. It’ll know if it’s touching an apple or an orange, an empty glass versus one full of water, or individual body parts.

It doesn’t take much to realize that the potential for computing breakthroughs is significant. Your phone could perform different actions depending on how and where you hold it. You might get a different interface if you’re wearing gloves, for instance. A restaurant would know to provide a refill the moment your drink is empty, and the blind could identify products in a store. It could be particularly useful for automatic sorting in farms and waste facilities, as well. The biggest obstacle is translating RadarCat from a clever concept to a practical product — that could take a while.
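Under the hood, this kind of material recognition is classification over radar reflection features. Here’s a hedged sketch of the idea using nearest-neighbour matching — the three-number “signatures” and labels are invented for illustration, since Soli’s real feature set isn’t public:

```python
import math

# Hypothetical radar "signatures": a few numbers standing in for the
# reflection features Soli actually measures (the real ones aren't public).
TRAINING = [
    ((0.9, 0.1, 0.3), "steel"),
    ((0.2, 0.8, 0.5), "apple"),
    ((0.1, 0.2, 0.9), "glass of water"),
]

def classify(signature, training=TRAINING):
    """1-nearest-neighbour: return the label of the closest known signature."""
    return min(training, key=lambda item: math.dist(item[0], signature))[1]

print(classify((0.85, 0.15, 0.25)))  # closest to the steel signature: "steel"
```

RadarCat’s published approach uses proper machine learning over many more features, but the core move is the same: map a measured reflection into feature space and find which known material it most resembles.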

Via: FastCo Design

Source: University of St. Andrews

9
Nov

Google doesn’t want proprietary fast charging in Android phones


Google isn’t a fan of non-standard approaches to fast-charging Android phones over USB-C, and it’s bent on having manufacturers fall in line. Its newest Android Compatibility Definition document (for Android Nougat) now says it’s “strongly recommended” that device makers don’t support proprietary charging technology that modifies voltages beyond standard levels, or otherwise creates “interoperability issues” with standard USB charging. In other words, tech like Qualcomm’s Quick Charge 3.0 is likely considered naughty. On top of that, the company warns that later versions of Android might even require full interoperability with standard chargers.

This doesn’t mean that you won’t see fast charging. Remember, both of Google’s Pixel phones can top up quickly. However, it’s evident that Google would like to fulfill USB-C’s promise of cables and chargers that always work together. It doesn’t like the idea that you might have to carry a specific charger for your phone to work as expected, or that a flaky cable might fry your charger, phone or both. The company might never force vendors to drop their preferred fast charging standards, but it certainly won’t look kindly on them.

Via: Android Police, Phandroid

Source: Google (PDF)

9
Nov

Google slaps ‘repeat offender’ tag on unsafe sites


Google is closing a loophole in its Safe Browsing policy. While it already flags sites that violate its malware, phishing and other policies, bad actors can temporarily halt those activities. Then, once the warnings are removed, they resume, and unsuspecting searchers are none the wiser. Starting today, however, Google is flagging such sites as “repeat offenders,” and webmasters won’t be able to appeal the warnings for 30 days.
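Mechanically, the policy amounts to bookkeeping: remember how many times a site has been flagged, and if it’s been flagged more than once, lock out appeals for 30 days. A toy model of that logic as described in the announcement (the class and its details are my own sketch, not Google’s implementation):

```python
from datetime import date, timedelta

class SafeBrowsingLedger:
    """Toy model of the repeat-offender policy described above."""
    APPEAL_LOCKOUT = timedelta(days=30)

    def __init__(self):
        self.flag_count = {}     # site -> number of times flagged
        self.locked_until = {}   # site -> earliest date an appeal is allowed

    def flag(self, site, today):
        self.flag_count[site] = self.flag_count.get(site, 0) + 1
        if self.flag_count[site] > 1:  # flagged before: repeat offender
            self.locked_until[site] = today + self.APPEAL_LOCKOUT

    def can_appeal(self, site, today):
        return today >= self.locked_until.get(site, today)

ledger = SafeBrowsingLedger()
ledger.flag("shady.example", date(2016, 11, 9))    # first offence
ledger.flag("shady.example", date(2016, 11, 20))   # repeat offender
print(ledger.can_appeal("shady.example", date(2016, 11, 25)))  # False
print(ledger.can_appeal("shady.example", date(2016, 12, 20)))  # True
```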

Hacked websites will not be classified as repeat offenders; “only sites that purposefully post harmful content will be subject to the policy,” Google notes. There’s nothing stopping you from clicking on a link anyway, of course, but a Google Search warning will no doubt dissuade a lot of users. In addition, Google Chrome will put up another warning page that will probably convince the majority of users to not enter a dangerous site.

Google’s Safe Browsing no doubt stops a lot of hacking, and the new policy will help. What’s really needed, however, is a way to stop or limit the damage from email phishing attacks. Those have caused some of the largest breaches on the internet, reportedly including the hack of Clinton campaign chair John Podesta.

Source: Google

9
Nov

Australian researchers have built a better qubit


Qubits, the units of information used by quantum computers, make use of a phenomenon known as “superposition,” wherein they can exist in two separate quantum states simultaneously. Theoretically, they’d enable computers to perform a variety of tasks far faster than conventional machines by carrying out many computations in parallel. The problem is that qubits tend to be very unstable, which prevents the information they contain from being read. However, a team of researchers from the University of New South Wales (UNSW) in Australia may have finally tamed the elusive qubit. They’ve coerced one into remaining stable for ten times as long as normal qubits.

“We have created a new quantum bit where the spin of a single electron is merged together with a strong electromagnetic field,” Arne Laucht, a Research Fellow at UNSW, said in a statement. “This quantum bit is more versatile and more long-lived than the electron alone, and will allow us to build more reliable quantum computers.”

These “dressed qubits” (so called because they’re “dressed” by the electromagnetic field) are able to retain information far longer than the standard “spin” qubit. And the longer a qubit can hold onto that information, the more powerful the computations researchers can run. The UNSW researchers create these custom qubits by blasting an electron’s spin with a continuously oscillating magnetic field at microwave frequencies. Changing the frequency of the field adjusts the electron’s spin, much the same way that sound travels over FM radio.
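That driven spin is textbook Rabi physics: on resonance, the probability that the spin has flipped oscillates at the so-called Rabi frequency set by the drive strength. A quick worked example using the standard two-level formula — these are generic illustrative numbers, not UNSW’s actual device parameters:

```python
import math

def flip_probability(rabi_freq_hz, t_seconds):
    """Probability that a resonantly driven spin has flipped after time t.

    Standard two-level Rabi formula P(t) = sin^2(pi * f_Rabi * t);
    textbook physics, not a model of UNSW's dressed qubit specifically.
    """
    return math.sin(math.pi * rabi_freq_hz * t_seconds) ** 2

# With a 1 MHz Rabi frequency, a 0.5-microsecond pulse is a full "pi pulse":
# the spin flips with near-certainty. Double the pulse length and it's back
# where it started.
print(flip_probability(1e6, 0.5e-6))  # ~1.0
print(flip_probability(1e6, 1.0e-6))  # ~0.0
```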

This advancement could finally make quantum computers genuinely useful and bring them mainstream. There is certainly no shortage of interest: Intel and Google are both already working on quantum computer designs of their own.

Via: The Manufacturer

Source: UNSW