Archive for 8 Aug

Engadget giveaway: Win a OnePlus 5 smartphone and backpack!


OnePlus has been making its mark on the mobile scene for a while now, and its latest phone doesn’t disappoint. The OnePlus 5 is a combination of features, style and price that hits the sweet spot, with the top-of-the-line version packing 8GB of RAM and 128GB of onboard storage for just $539. There’s a dual camera here, along with software providing bokeh control and the ability to fine-tune settings like ISO, shutter speed and more. A Qualcomm Snapdragon 835 processor runs the show, and battery life is improved over the previous 3T model, even offering quick charging so you can have “a day’s power in half an hour.” This week, the company has provided us with two of its OnePlus 5 Midnight Black 128GB models, along with some swag for two lucky readers. All you need to do is head to the Rafflecopter widget below for up to three chances at winning one of these highly rated handsets. Good luck!

[Widget: a Rafflecopter giveaway]

  • Entries are handled through the Rafflecopter widget above. Comments are no longer accepted as valid methods of entry. You may enter without any obligation to social media accounts, though we may offer them as opportunities for extra entries. Your email address is required so we can get in touch with you if you win, but it will not be given to third parties.
  • Contest is open to all residents of the 50 states, the District of Columbia and Canada (excluding Quebec), 18 or older! Sorry, we don’t make this rule (we hate excluding anyone), so direct your anger at our lawyers and contest laws if you have to be mad.
  • Winners will be chosen randomly. Two (2) winners will each receive one (1) OnePlus 5 smartphone – 128GB, one (1) Sandstone protective case and one (1) OnePlus Travel backpack.
  • If you are chosen, you will be notified by email. Winners must respond within three days of being contacted. If you do not respond within that period, another winner will be chosen. Make sure that the account you use to enter the contest includes your real name and a contact email. We do not track any of this information for marketing or third-party purposes.
  • This unit is provided purely as a promotional giveaway. Engadget and AOL are not liable for honoring warranties, exchanges or customer service.
  • The full list of rules, in all of its legalese glory, can be found here.
  • Entries can be submitted until August 9th at 11:59PM ET. Good luck!

Autonomous wheelchairs arrive at Japanese airport


Passengers with limited mobility will soon be able to navigate airports more easily thanks to Panasonic’s robotic electric wheelchair. Developed as part of a wider program to make Japan’s Haneda Airport more accessible to all, the wheelchair uses autonomous mobility technology: after users input their destination via smartphone, the wheelchair identifies its position and selects the best route to get there.

Multiple chairs can move in tandem, which means families or groups can travel together, and after use, the chairs ‘regroup’ automatically, reducing the workload for airport staff. The chairs also use sensors to stop automatically if they detect a potential collision.

The chairs will be tested between now and March 2018 alongside a number of other initiatives devised by Panasonic and NTT. Other programs include eliminating language barriers through smartphone object recognition technology (so just point your smartphone at a sign for a translation), reducing passenger congestion through crowd analysis technology and clearer intelligent audio signage for those with impaired vision.

Source: Panasonic

US agriculture agency tells staff not to mention climate change


It’s no secret that the Trump White House is no fan of climate change science, but that’s been having more of a chilling effect than you might think. The Guardian has obtained emails showing that the US Department of Agriculture’s farm conservation division, the Natural Resources Conservation Service, has been telling staff to avoid using language that directly references climate change. Instead of “climate change,” workers are told to refer to “weather extremes;” instead of talking about how the country can “reduce greenhouse gases,” they’re asked to talk about “build[ing] soil organic matter.”

The director issuing these statements, Bianca Moebius-Clune, stressed that this wouldn’t alter the scientific modeling, just how it’s discussed. Still, some in the organization aren’t happy. One staffer was worried that altering the language would jeopardize the “scientific integrity of the work.” That’s not without merit: shortly after Trump’s inauguration, NRCS deputy chief Jimmy Bramblett warned that air quality research relating to greenhouse gases “could be discontinued.”

That the current US administration would frown on the mere mention of climate change isn’t surprising. Just ask the EPA. And really, this is an extension of what Florida saw when Governor Rick Scott banned official talk of climate change in 2015 — the political leadership is pretending an issue doesn’t exist by refusing to say its name. However, the emails illustrate both how this censorship takes place and the extent to which it’s voluntary. In the USDA’s case, it’s likely a defensive tactic. The department wants to avoid incurring the White House’s wrath, which could put vital science initiatives in jeopardy.

Source: Guardian

The man who put us through password hell regrets everything


If you rue the inevitable day when IT makes you change your password, you’re not alone. It’s incredibly frustrating to keep dreaming up new passwords with a capital letter, a special character and numbers that aren’t just variations on your old one. And it turns out we’re pretty bad at it, which is why the man responsible for the password hell of the past decade has recanted his recommendations.

Bill Burr, a manager at the National Institute of Standards and Technology (NIST), wrote a password primer in 2003 that recommended many of the rules we have now: special characters, capitals and numbers. He also recommended that passwords be updated regularly (THANKS, BILL). That document became the basis for the password policies now prevalent across government, business and other institutions. But now, the 72-year-old password guru tells The Wall Street Journal that “much of what I did I now regret.” So do we, Bill. So do we.

But there’s some good news: NIST has overhauled these guidelines, and the revisions have just been finalized. One new recommendation is that IT departments should only force a password change when there’s been some kind of security breach. Otherwise, the changes we make tend to be incremental: forced to switch passwords every 90 days, most people just swap out a single character. That renders the bulk of passwords incredibly ineffective, so the practice actually harms security rather than helping it.

Another recommendation is to favor long phrases, rather than short passwords with special characters. There should no longer be a requirement to have a certain mix of special characters, upper case letters and numbers for a password. It turns out that adding in these artificial password restrictions actually produced less secure passwords. Additionally (and unsurprisingly), the guidelines recommend screening passwords against commonly used passwords or ones that have been compromised.
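
Taken together, those recommendations are easy to express in code. Below is a minimal sketch of a validator in that spirit, favoring length over forced character classes and screening against known-bad passwords; the function name and the inline blocklist are illustrative stand-ins, not part of NIST’s specification.

def is_acceptable(password, blocklist, min_length=8):
    # Length is the primary requirement; no forced mix of special
    # characters, upper case letters and numbers.
    if len(password) < min_length:
        return False
    # Screen against commonly used or previously compromised passwords.
    if password.lower() in blocklist:
        return False
    return True

# Toy usage with an inline stand-in for a real breached-password list.
blocklist = {"password", "123456", "qwerty", "p@ssw0rd"}
print(is_acceptable("correct horse battery staple", blocklist))  # True
print(is_acceptable("P@ssw0rd", blocklist))                      # False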

You can read the full set of guidelines at NIST’s website, but this news should be music to the ears of anyone who’s struggled with passwords. While no organization is required to adopt the new measures, recommendations like these are usually implemented as security best practices.

Source: The Wall Street Journal, NIST Special Publication 800-63B

Spotify makes its long-awaited debut on Xbox One


Well, that didn’t take long. Less than a week after members of the Xbox crew were seen spinning tunes from Spotify on the console, the streaming service is making its official debut. Reports last week indicated that the switch would flip before the arrival of the Xbox One X, but it looks like Microsoft wanted to make the leap before summer’s end. Despite the long wait, the tie-in between Spotify and Microsoft is rather straightforward. Once you install the Spotify app on your Xbox One and input your account info, you’ll be able to soundtrack gaming sessions with the music of your choice.

As you might expect, there are curated playlists specifically picked for gaming. Listen to game soundtracks via “Epic Gaming,” get hyped for battle with “Power Gaming” and stream a selection of tunes from Xbox’s Major Nelson. The Xbox One app also works with Spotify Connect, so you can use your phone or another device to control the music from the Spotify app when you’re in the middle of a game. The best part? The music service is available on the console today, so you won’t have to wait to blast Katy Perry behind the wheel in Forza Horizon 3. It’s available to Spotify users in 34 markets, including North America and much of Europe.

While Spotify is just now hitting Xbox, it debuted as a PlayStation exclusive back in 2015, powering PlayStation Music with its massive catalog. The move to Xbox One follows SoundCloud, which hit the console at the end of May. Of course, Microsoft’s own Groove Music is already an option, and likely another reason Spotify didn’t make an appearance on the console sooner. Both Pandora and iHeartRadio were already available, too. Coupled with yesterday’s reveal of a new interface, today’s Spotify news makes quite the week for Xbox owners.

‘The Dog Whisperer’ curated an audiobook collection for your pup


Have you ever been listening to an audiobook and thought, “I love this, but I wish they had more offerings for dogs”? If so, you might be the only person, but Cesar Millan still wants to help you. The dog whisperer is teaming up with Audible to launch an audiobook service directed at your furry canine companion.

Audible for Dogs is designed for anxious pets that don’t like being left at home alone; Millan’s Dog Psychology Center reports that audiobooks are much more effective than music at relaxing dogs and reducing their stress. The idea is that pet owners will leave one of these audiobooks playing when they leave for work in the morning; whether the dog is in a crate or roaming the house, the audiobook will supposedly keep it calm and help it feel less alone.

You can start with the free Cesar Millan’s Guide to Audiobooks for Dogs, because apparently this whole program is complicated enough to require an entirely separate audio guide. The rest of the titles are handpicked by Millan and include a video introduction from the man himself detailing why exactly these books are good for dogs. Titles include Born a Crime by Trevor Noah, Pride and Prejudice by Jane Austen (these two definitely need an explanation because I’m not seeing the dog connection), The Art of Racing in the Rain by Garth Stein and A Dog’s Purpose by W. Bruce Cameron.

This isn’t a completely separate program; subscribers to Audible will be able to purchase these audiobooks, and anyone can access Millan’s videos for free on the website. It’s not clear what would prevent people from just buying regular, non-dog-selected audiobooks and playing them for Fido, but hey, I am not here to judge what you buy for your pet.

Source: BusinessWire

David Letterman is joining forces with Netflix for a new show


If you’ve wondered whether David Letterman has been enjoying his retirement, well, the answer is apparently no. It seems he’s been bored, which is why he’s struck a deal with Netflix for a brand-new show.

The series will consist of six hour-long episodes that feature a single guest in conversation with Letterman — a sort of long-form interview. However, he’ll also go out into the field to explore topics and issues away from a desk. The series does not yet have a name. Letterman said of the deal: “Here’s what I have learned, if you retire to spend more time with your family, check with your family first.” It’s scheduled to premiere in 2018.

This is just the latest in a slew of big gets for Netflix, from a movie starring Sandra Bullock to yesterday’s acquisition of Mark Millar’s comic book empire. It’s difficult not to be impressed by how far the company has come from its DVDs-by-mail starting point. Netflix has become a major player in the TV and film landscape, and this newest get makes it clear it’s here to stay.

Source: Netflix

Instagram livestreamers can add a guest to their broadcasts


Livestreaming is becoming a major part of social networks — Instagram, Twitter (via Periscope) and Facebook have all been pushing it in our faces for a while now. They all work more or less the same way, but Instagram is adding an intriguing new feature to the mix. Today, the company announced that some users will be able to add a “guest” to their live broadcasts, essentially adding a second contributor to the livestream. This lets you have a live conversation with a friend and broadcast both sides of that chat to your followers.

Once you have a guest in your livestream, you can boot them out at any time and add another, or the person you invited can leave whenever they like. When you’re livestreaming with a guest, followers will see the screen split 50/50 between what your camera is broadcasting and what the other person in the stream is shooting. For now, you can only add one person to the stream; there’s no word on whether Instagram will let you add multiple guests, but it seems like that could get pretty complicated on a small screen.

Instagram says it is only testing out this feature with a “small percentage” of users for now, but it’ll roll out globally in the coming months.

Source: Instagram

Swarm redesign shows just how much it knows about you


Foursquare’s Swarm check-in app has catered to lifeloggers before, but never quite like this. The location-centric company is launching Swarm 5.0 for Android and iOS with your trip history as its focus. The check-in map is now at the very heart of the app, making it easy to see where you’ve been. You can share it, too, if you want to keep your friends up to date. There’s also a revamped profile that gives you a better view of your many achievements, whether it’s your ongoing check-in streaks or the number of unique places you’ve visited. You can even see places you have yet to mark on your ‘bucket list.’ It’s a bit disconcerting to realize how much info you’ve volunteered to Swarm, but this is a big help if you’re trying to remember the last time you visited a given restaurant.

This also promises to be a more social app… if you want it to be. You should have an easier time keeping up with your friends’ check-ins, and you can search for friends, visited categories and history. There is a change to Swarm’s longstanding stickers, though. Now, the only way to upgrade your stickers (and claim that sweet, sweet multiplier bonus) is to check in enough times to unlock higher levels.

Yes, this is yet another tweak to Swarm’s mission since its split from the Foursquare app. However, you could say that this is more honest. Lifelogging is one of the biggest reasons to use Swarm outside of the endless one-upmanship from its game component, and it’s now at the heart of the app. The sticker change is also helpful for Foursquare’s effort to serve businesses. Now that you can’t just spend coins to upgrade stickers, you’ll have more reasons to check in and provide valuable information about locations.

Source: Foursquare, App Store, Google Play

The real-time motion capture behind ‘Hellblade’


In a makeshift changing room filled with Disney Infinity figures, I strip down to my boxers and pull on a two-part Lycra suit. It feels tight, and the top half shimmies up toward my waistline as soon as I stretch or stand up straight. How anyone is able to act in this thing is a mystery to me. Sheepishly, I gather my belongings and trot back to the motion capture studio that sits at the end of Ninja Theory’s offices in Cambridge, England. Inside, a couple of engineers scurry about, prepping cameras and cables.

For years, movie and video game studios have used mocap to bring digital characters to life. From detective Cole Phelps in L.A. Noire to the powerful Caesar in Planet of the Apes, the technology has delivered some truly moving, actor-driven performances. Normally, however, motion capture scenes are processed by an animator hours, days or weeks after they’ve been captured on set. It’s a time-consuming process, and one that involves some guesswork. In a sparse, lifeless room, directors are forced to imagine how a take will look in the final sequence.

Not so with Ninja Theory. The video game developer has a unique setup that allows Chief Creative Director Tameem Antoniades and his team to preview scenes in real time. Pre-visualisation, or pre-vis, is not new to the industry, but it’s typically limited to body tracking. Full-character modelling is rare, especially at the kind of fidelity Ninja Theory is shooting for with its next game, Hellblade: Senua’s Sacrifice.

On a wet, dreary August afternoon, I prepare for my first motion capture performance. An engineer says hello and starts sticking various balls to my suit, covering important joints and muscles. I then slip my shoes inside some special wraps, kept in place with bright pink tape, and grab a peaked cap that can monitor my basic head position. I look and feel ridiculous. In the corner, behind a bank of PCs, another member of the team asks me to stand in a “T” position, arms stretched out wide. It’s time to see what my body is capable of.

The next 10 minutes is a short aerobic workout. I’m asked to spin my arms in a circular motion before rotating my hips and lunging like an Olympic weightlifter. These exercises, I’m told, help the system to understand my body’s full range of motion. Then, on a wall-mounted monitor, I see my character appear. First it’s just a bevy of dots floating in space, then a blue, jellylike figure with no discernible features. Finally a strange, nightmarish warrior appears with bulging muscles and an animal-skull helmet. Branches poke out the back of his head, adding extra height to an already imposing figure.

Melina Juergens, the actress behind Hellblade‘s lead character, Senua, enters the room in another mocap suit. Her setup is a little different from mine, given she has a full digital double in the game. A circular, plastic arm wraps around the front of her face, similar to orthodontic headgear, with an LED light strip and cameras fitted on the inside. Senua soon pops into the scene, a powerful Celtic warrior covered in cuts and symbolic blue body paint. We are standing on a beach, with a huge tree behind us covered in flames and hanging bodies. It’s a dark, sinister scene, but my first reaction is to dance around like a drunkard at a jamboree.

The Viking warrior matches my movements, and for a moment, I’m lost in the magic. I spend the next half hour with Juergens dancing, pretend fighting and playing the most surreal game of red hands. All the while I’m looking over my shoulder at a wall-mounted monitor, marveling at how the scene is able to render my movements with zero perceivable lag. Antoniades seems to be enjoying the moment too. He glides around the room with a two-handed camera grip that’s also fitted with motion-tracking balls. There’s nothing inside the cradle, however — it’s merely a prop to move the perspective, or virtual “camera,” inside the world of Hellblade.

[Image: Mocap suits are an unflattering but essential part of performance capture.]

Ninja Theory has a long history of using technology to push the visual quality of its games. In 2007, the company released Heavenly Sword, a hack-and-slash adventure for the PlayStation 3. The cinematics were crafted with motion capture technology developed by Weta Digital, a visual effects company in New Zealand co-owned by Peter Jackson. The star-studded cast included Andy Serkis, best known for his role as Gollum in The Lord of the Rings movies, and Anna Torv, who played FBI agent Olivia Dunham in the J. J. Abrams sci-fi drama Fringe.

Heavenly Sword was one of the first games to use performance capture. While the combat was criticized for its repetitive nature, reviewers praised the “stellar” character performances and “stunning” presentation. The team took a similar approach with Enslaved: Odyssey to the West in 2010, once again using performance capture with an experienced TV and movie cast that included Serkis and Pretty Little Liars regular Lindsey Shaw. The game was flawed, but with Ex Machina director Alex Garland as co-writer, the press commended its “strong” script and oftentimes “beautiful” visuals.

After the divisive DmC: Devil May Cry reboot and its work on Disney Infinity 3.0, Ninja Theory went fully independent. Before, its games had been funded by Sony and juggernaut publishers Bandai Namco and Capcom. But middling sales and an increasingly competitive market, in which blockbuster games are expected to shift millions, made it tough for the team to pitch a title that wasn’t “design by spreadsheet,” in Antoniades’ words. So with a team of just 13, Antoniades decided to change tack and make the equivalent of an indie film: a beautiful and artistic game, but self-published and with a budget orders of magnitude smaller than a normal triple-A studio’s.

Antoniades says Garland is partly responsible for the shift. “Towards the tail-end of Enslaved, in which he worked with us quite deeply, he said that he just couldn’t understand why, in the gaming world, we went from the bedroom to blockbusters, and there wasn’t the equivalent of independent movies. Movies that can sit alongside the blockbusters in the cinema, but aren’t seen as second tier or cheap in any way. That stuck in my mind.”

[Image: Melina Juergens plays Senua in Hellblade.]

With Hellblade, Ninja Theory had to work differently. The team is chock-full of development experience but couldn’t rely on the tools and workflows it had used before. It didn’t have dozens of people to meticulously design and create levels, for instance, or access to Weta Digital, which was working on films like Godzilla, The Hobbit and Planet of the Apes.

So Ninja Theory started experimenting. If an expensive solution wasn’t available, the team would try to research, prototype and build something cheaper. “It’s the hobbyist approach,” Antoniades explains. “In theory, a lot of the techniques and high-end techniques are actually, fundamentally, quite simple. So it’s just being daring enough to say, ‘Well, maybe we can just find a shortcut through this, and find another way.’”

Early on, for instance, the team looked at photogrammetry, a way of measuring depth through photos, to create a face scan of Juergens. Later, the team built Hydra, a prototype camera rig with multiple GoPros and detachable lenses to track the actor’s face and body movements as well as the position of the filmer in 3D space. Another prototype used a cricket helmet and a webcam to record faces. At one stage the team had a lighting system housed inside a plant pot, powered by a Raspberry Pi and some custom code, to capture skin and other surfaces in minute detail.

Some worked, some didn’t. When it couldn’t solve a problem on its own, Ninja Theory reached out to specialists who were willing to collaborate. “It was always a fight,” Antoniades recalls. “It was always stealing, borrowing, inventing as we went. We felt like rascals.” But the team never felt restricted or disheartened. Vicon, an expert in motion capture systems, loaned the studio 12 of its Bonita capture cameras. Ninja Theory then converted its largest meeting room into a mocap studio, mounting the cameras on Ikea poles and lighting its actors with cheap LED panels from Amazon.

In Hellblade, the player guides Senua through the Viking underworld of Hel. It’s a dark, mysterious place filled with rain, fog and stony ruins. The Pictish fighter carries the head of her former lover, Dillion, in a dirty sack, hoping to bargain for his soul with the ruler of this strange realm. She suffers from psychosis, which comes through in the game as whispering voices and twisted, frightening visions. Her journey through Hel is also one of the mind, helping the player to understand, at least in part, the people and events that have caused her so much trauma.

It’s a brave, ambitious concept. Ninja Theory is promising a personal, emotional tale that tackles mental health in a way rarely seen in video games. The narrow focus helped the team with development. Senua is the only character that’s portrayed with a full 3D model, dialogue and facial expressions. That meant the team could channel its efforts into making her the most realistic and believable heroine possible. It also meant, however, that the game would thrive or die based on her depiction. “I wanted this game to be about a character, and I wanted that character to be the best character we’ve ever done,” Antoniades says.

But what came first, the concept or the need to keep the game on a smaller scale? “I think they went together,” he says. “The idea can only survive if it’s achievable. I did want to do a game based on a character’s story, and I knew that we could only afford to focus on one character, in terms of technology and resources. So, then the question became, ‘How do we make this truly intimate story about one character? Is it even possible to carry a whole game with just one speaking character?’”

[Image: Ninja Theory co-founder and Hellblade creative director Tameem Antoniades]

Partway through development, Ninja Theory realized that it needed a higher resolution face scan of Juergens. The team reached out to 3Lateral, a company in Serbia specializing in 3D scanning and character rigging, which is the underlying skeleton, or puppet strings needed to power a virtual person. Antoniades was up front and explained that the team had a modest budget but wanted to make the best character in the industry. “People respond well to that kind of thing, because they want to show off their stuff as being the best as well,” he explains.

Coincidentally, 3Lateral had been developing a new, prototype scanning system in secret. It was at this stage that Antoniades asked Juergens, who had been a stand-in actress for the project, whether she wanted to play Senua in the final game. She agreed, and the team quickly booked a flight to the Balkans. “It was cutting-edge tech,” Antoniades says. “It was just unproven at the time, and we were the guinea pigs. But it worked beautifully. The detail was just incredible.”

Next, the company turned to Cubic Motion, a team in Manchester focused on computer vision. Its technology serves as a middleman in performance capture, tracking and analyzing the actor’s face while she performs in the studio. The resulting data — a 3D point cloud, consisting of roughly 200 virtual markers — is then read and replicated by the digital rig controlling Senua’s face in the game. The best part is that Cubic Motion can gather this data with video footage alone, removing the need to plaster the actor’s face with Ping-Pong balls or crosses.

“We can track 30 to 40 points just on the inside of the lips, and you could never get any of that from an optical-based system,” David Barton, a producer at Cubic Motion, explains. His team has worked with 3Lateral before, combining its computer vision system — known in the industry as a facial solver — with the latter’s rigs. It’s a perfect partnership; after all, granular facial tracking is pointless if the rig powering the digital character isn’t capable of replicating the same subtleties.
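
The article doesn’t detail how Cubic Motion’s solver works internally, but the general idea of a facial solver can be sketched: turn a frame’s tracked landmark positions into values for the rig’s controls. Here’s a toy illustration using a simple linear least-squares mapping; the marker count comes from the article, while everything else (the control count, the linear model, all names) is an assumption for illustration.

import numpy as np

# Toy facial-solver step: map tracked 2D landmark positions (roughly 200
# points, per the article) to rig control values (jaw, lips and so on).
# The linear model and all names here are illustrative assumptions, not
# Cubic Motion's actual method.
N_LANDMARKS = 200   # virtual markers tracked from video
N_CONTROLS = 50     # hypothetical number of rig controls

def fit_solver(landmarks, controls):
    # landmarks: (samples, N_LANDMARKS * 2) flattened x,y positions
    # controls:  (samples, N_CONTROLS) hand-tuned target rig values
    W, *_ = np.linalg.lstsq(landmarks, controls, rcond=None)
    return W

def solve_frame(frame_landmarks, W):
    # One tracked frame in, one set of rig control values out.
    return np.clip(frame_landmarks @ W, 0.0, 1.0)

rng = np.random.default_rng(0)  # random stand-in data for tracked footage
W = fit_solver(rng.random((500, N_LANDMARKS * 2)),
               rng.random((500, N_CONTROLS)))
print(solve_frame(rng.random(N_LANDMARKS * 2), W).shape)  # (50,)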

Hellblade is built on Epic Games’ Unreal Engine. In early 2016, Kim Libreri, the company’s CTO, visited Ninja Theory’s offices to see how its latest project was progressing. Before joining Epic, Libreri was Chief Strategy Officer at Lucasfilm and worked on visual effects for more than 25 films, including The Matrix, Speed Racer, Poseidon and Super 8. “He invented bullet time in The Matrix,” Antoniades says simply. “But he’s not one of these hotshot, Hollywood-type people who look down on games. He’s a lifelong gamer who sees video games as being at the cutting edge of innovation.”

Libreri wanted to showcase Unreal’s capabilities with a real-time motion capture demo at GDC, a prestigious video game developers’ conference. Ninja Theory had the assets and collaborators to make it happen and immediately agreed to Epic’s proposal. “We thought it would be a cool demonstration of how game engines bring something very different,” Libreri says. “Normally, you would only associate that kind of fidelity — from an animation and lighting and texturing perspective — with movies. And we were like, we can use pretty much the same techniques but do it live, because of the power of the Unreal Engine.”

The only problem? GDC was eight weeks away. Ninja Theory, Epic, 3Lateral, Cubic Motion and Xsens, a company brought in to handle body tracking, needed to move quickly. For Cubic Motion, it was particularly tough. Typically, the team takes hours to crunch, or “solve,” facial data gathered during a mocap shoot. “Now we had about sixteen milliseconds to track, solve and output that data to Unreal,” Barton explains. Thankfully, Cubic Motion had been working with Ninja Theory for some time and had been training its system to work with Juergens’ face. Still, it needed some refinements.
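
That sixteen-millisecond figure is the whole frame budget at roughly 60 frames per second: tracking, solving and handoff to the engine all have to finish before the next frame arrives. Here is a rough sketch of what such a per-frame budget check might look like; the stage functions are placeholders, not any vendor’s API.

import time

FRAME_BUDGET = 0.016  # ~16 ms per frame at 60 fps

def track(frame):        # placeholder: find facial points in the frame
    return frame

def solve(points):       # placeholder: points -> rig control values
    return points

def send_to_engine(controls):  # placeholder: hand values to the engine
    pass

def process_frame(frame):
    start = time.perf_counter()
    send_to_engine(solve(track(frame)))
    elapsed = time.perf_counter() - start
    # An overrun here would show up on stage as lag or a dropped frame.
    if elapsed > FRAME_BUDGET:
        print(f"over budget: {elapsed * 1000:.1f} ms")

for frame in range(3):
    process_frame(frame)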

“When she’s driving it live on the big screen, there can be no tracking errors, certainly no catastrophic tracking errors, because that would just make her face explode, for example,” Barton says.

A week before the presentation, the system was “barely working,” according to Antoniades. All five companies spent the last three days in San Francisco fighting to iron out the kinks. “It was like an operations room,” he recalls. “I saw it as compressing two years of R&D effort by lots of different companies into a few weeks.” But everything came together. On the day, Juergens was able to drive Senua without any problems. Once the scene had ended, Antoniades explained that it was, in fact, an actor controlling the character live. The crowd went wild as Juergens sang “Do You Want to Build a Snowman,” dispelling any fears that the presentation had been faked.

Barton says he felt relief more than amazement or pride. “Because there are a lot of things that can go wrong in a real-time demo,” he says, “especially when it’s the first time anyone has done it at that level. So it was relief, but also a lot of pride that it came together in such a short amount of time.”

Later that year, Ninja Theory demonstrated the technology again at Siggraph, a conference for visual effects and interactivity. It was part of a real-time graphics competition that included Pixar, Industrial Light & Magic, Oculus, Square Enix and Uncharted developer Naughty Dog. For its second outing, Ninja Theory showed how it was possible to shoot, capture and edit a scene using performance capture and Sequencer, a cinematic editing tool that runs inside Unreal. In this version, Juergens performed twice in quick succession — once as Senua and a second time as a projection of her inner voice.

On the second time through, Juergens was able to act against her previous performance. Both takes were then combined inside Sequencer to create the final scene. It was enough to impress the judges and land Ninja Theory the award for Best Real-Time Graphics and Interactivity showcase. “We had just created a whole scene with two characters talking to each other,” Antoniades says. “Camera, framing, environment, everything. I think that really demonstrated how powerful it is.”

The real-time motion capture system was finished quite late in Hellblade‘s development, so Ninja Theory only used it for a few scenes in the final game. But the studio and necessary tools are now a permanent fixture at the company’s offices, meaning Antoniades and his team can push the technology further in future projects. “It’s definitely something that we can take forward,” he says.

The technology should make Ninja Theory more efficient in the future. Following a shoot, an animator might still go in and fine-tune the character’s movements. But what they’re given should be closer to what a studio would consider final quality. The animator can then spend more time on the finer details or finish up faster and move on to other tasks. Real-time motion capture also allows directors to review footage on set and provide better feedback to actors. No longer do they have to imagine how a performance will look in the final cut. That in turn should result in better takes and fewer frustrating reshoots.

Real-time motion capture could enable new kinds of experiences too. “The fact that you can live-drive a character means that a famous character from a video game can now be interviewed, as if it was you and me talking right now,” Libreri says. “The same goes for concerts or live performances for people at home, either watching through a web browser or in VR. You’ll be able to have this sort of mass-performed digital theater of the future.”

Hellblade is Ninja Theory’s attempt to show that independent games can still have jaw-dropping visuals. Regardless of how the game is received, it’s hard to argue with the quality of the cinematics. With a team of just 20, Ninja Theory has produced some truly dramatic and emotional scenes that rival the best in the industry. And along the way, it’s pioneered a new form of motion capture with a hobbyist attitude that nothing is impossible.

“If you focus on one thing and want to do it really well, anything’s possible,” Antoniades says. As I perform the macarena in my mocap suit, watching a strange Viking warrior follow along, I can’t help but agree.
