
14 Feb

ASUS, Project Tango, and what could have been (ASUS ZenFone AR Review)


Dat leather, tho.

Though it was released back in the second half of last year, the ASUS ZenFone AR remains an intriguing device, sporting Google’s now-defunct Project Tango on board. While Tango is gone, its spirit will live on in ARCore – and we can, perhaps, derive a few insights about ARCore from the apps and functionality built into Tango on the ZenFone AR. What follows is an honest and unbiased assessment of ASUS’ Project Tango phone, both from the perspective of its release and with the benefit of hindsight.

Build

The last few months of smartphone releases have seen a rather sudden about-face in design standards. Where we used to see unibody, milled alloy frames, we now see full-body glass designs – Samsung, HTC, and LG are all firmly on board that train, because absolutely no one pays attention to Apple’s terrible decisions from 2012, apparently. Crafting a shell out of glass – no matter how “durable,” as if that’s a thing a millimeter-thick piece of glass can be – was an awful idea when Apple did it, and remains an awful idea with the iPhone X, Galaxy S8, V30, and U11/+. Glass frames may be gorgeous, feel fantastic, and absolutely reek of premium, but let’s be blunt with ourselves – when you build a phone with glass, the first thing anyone is going to do is slap a case on it, eradicating all that Fancy in one fell swoop.

Fortunately, the ASUS ZenFone AR wasn’t released in the last few months, so it missed the memo. Instead of the shiny-but-fragile glass exterior that’s en vogue, it’s got the milled, alloy unibody of yester-month. In addition, the back surface is covered in a soft, burnished black leather that’s quite delightful to the touch. I honestly didn’t know I wanted this in a phone until I handled the ZenFone AR – now I never want to put a case on it. Unfortunately, my review unit also came stamped with a Verizon logo (the stamping itself is nicely done, but a carrier quite literally imprinting its name on a phone reeks of insecurity to me – and insecurity is not sexy), which hurts the aesthetic a little.

[Image gallery: A BUTTON. · Side controls are very standard. · Yup. Side controls. · Type-C and a 3.5mm jack! · Super boring top. · Craziest camera sensor ever. · Dat leather, tho.]

Carrier interference aside, the ZenFone AR is a handsome device – sturdy and truly pleasing in the hand. I can’t say there are many phones I’ve touched just to feel them in my hand, but this is one of them. The phone is mostly made of sleek, clean lines, with two exceptions – the camera bump and the fingerprint sensor. The former is understandable; in a phone that has invested so heavily in its camera – detailed below – it’s perfectly reasonable that it’ll have a larger-than-average impact on the overall frame. The camera itself is surrounded by a heavy-duty-looking metal plate, which gives it an almost industrial look.

Now for that latter problem: instead of a soft button, à la OnePlus, or a recessed hard key, like LG’s V30, the fingerprint sensor/Home button is a raised, rectangular hulk of a key that shatters the otherwise elegant profile so lovingly crafted by ASUS.

In older Android devices, we saw a lot of protruding physical buttons – the original devices even had trackballs. But for the most part, we’ve seen a shift to capacitive, software, or (at the very least) flat buttons in lieu of physical ones, to keep a phone’s sleek profile and flat surfaces unblemished. Why, then, has ASUS deemed it necessary to use a physical button as its fingerprint sensor/Home button? On a phone with a front face that’s otherwise 100% smooth, there is one, single rectangular button that crushes the dream. ASUS likely made it protrude to make the sensor easy to find by touch – and I totally accept that design logic. But after using the OnePlus 3T and its flat, capacitive fingerprint sensor rimmed by a barely-perceptible plastic bumper, the ZenFone AR’s solution just feels like a blunt instrument.

Display

The display on the ASUS ZenFone AR is among the prettiest I’ve ever seen on a smartphone, point blank. It doesn’t have the edge-to-edge display or the 18:9 aspect ratio that’s become so popular since the Galaxy S8 was released, but the quality of the display itself is excellent. The 5.7″ Super AMOLED screen displays colors with startling vibrancy, while the 2560 x 1440 resolution renders graphics with great clarity. It’s got solid maximum and minimum brightness settings, performing equally well in bright sunlight and darkened rooms. This performance can possibly be attributed to ASUS’ integrated Tru2Life technology – which sounds suspiciously like a buzzword, but ASUS can call it whatever it likes when the screen looks this good.

Build-wise, the display is a little disappointing when viewed through the lens of modern trends. It’s got a mere 79% screen-to-body ratio, a fairly far cry from the 90% we’re seeing on newer releases. Viewed from the time of release, though, the display is perfectly adequate. ASUS went with Corning Gorilla Glass 4 on the ZenFone AR – always a good decision when looking to shore up the (relative) durability of a smartphone’s display. It likely won’t survive a direct drop, but Gorilla Glass 4 is definitely scratch-resistant and weathers everyday use admirably.

Internals

When it comes to internal hardware, the ZenFone AR is an interesting beast. It runs a Snapdragon 821 processor with a whopping 8GB of RAM (6GB on the lower-tier model), meaning it can handle all but the most system-intensive apps and multitask like an absolute champ, even in split-screen. Two iterations of Snapdragon processors have hit the market since the 821 was released (with the 845 having just been announced in December), but the 821 is no slouch. Even when pitted head-to-head with LG’s latest and greatest – the V30 – the ZenFone AR more than holds its own. While it isn’t quite the bleeding edge, the 821 is still a formidable processor in today’s market; it’s the same chip used in the original Pixel, OnePlus 3T, and LG G6, and a step up from the 820 used in the Galaxy S7, LG V20, and Moto Z Force. The upper tier of the ZenFone AR also features 128GB of storage (64GB on the lower) – more than I, personally, will ever need and more than enough for most users.

In terms of connectivity, the AR features modern, though not bleeding-edge, standards: Bluetooth 4.2 and 802.11ac Wi-Fi. It would have been great to see Bluetooth 5, since the standard became available for implementation last year – and, apparently, can be enabled via software update – but we can’t rightfully blame ASUS for going with the more mature 4.2 standard when the ZenFone AR was released back in July.

The USB-C connector on the ZenFone AR supports QuickCharge 3.0 and BoostMaster Fast Charging which, according to ASUS, can take the AR’s 3300mAh battery from 0 to 60(%) in just under 40 minutes. Not quite the speed of Dash Charge on OnePlus phones, but still impressive. That same USB-C connector is also DisplayPort-compatible, meaning it supports video over USB – something that early USB-C adopters like the Nexus 6P didn’t support, much to my chagrin.
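As a quick sanity check on that charging claim, here’s the back-of-the-envelope math – a rough sketch that ignores conversion losses and the charge taper near the top, so treat the result as ballpark only:

```python
# 0 -> 60% of a 3300mAh battery in 40 minutes implies an average
# charge current of roughly 3A (losses and taper ignored).
capacity_mah = 3300
charged_fraction = 0.60
minutes = 40

avg_current_ma = capacity_mah * charged_fraction / (minutes / 60)
print(f"average charge current: ~{avg_current_ma:.0f} mA")  # ~2970 mA
```

Roughly 3A at the battery is well within what QuickCharge 3.0-era hardware delivers, so the claim is at least plausible on paper.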

Sound-wise, ASUS throws a bunch of buzzwords at consumers to make the ZenFone AR sound like an audial beast: 5-magnet speakers (how many magnets do smartphone speakers normally have?), 140% louder (than… what, exactly?), 17% low-frequency extension (what, even?), and 7.1-channel virtual surround sound (hey, I know what that means!). In practice, the AR’s speakers are noticeably louder than those on my trusty OnePlus 3T, and there’s even a toggle-able “Outdoor Mode” that boosts volume further for use in loud environments – though why they didn’t just make that boosted volume the top end of the volume slider, I’ll never know (ASUS’ version of turning the volume up to 11, perhaps?).

The AR also sounds fantastic when plugged into good-quality headphones – so while I can’t independently confirm each of the fancy buzzwords above, I can sing the praises of the ZenFone AR’s sound performance as a whole: it’s great. Best, perhaps, when plugged into the 3.5mm jack with a high-end, over-ear headset to take advantage of those aforementioned buzzwords, but a solid performer regardless of how you listen – be it the internal speakers, Bluetooth buds, or a full-size headset.

Camera


Aside from – or perhaps because of – Project Tango, the ZenFone AR’s most impressive feature is its camera – or rather, cameras. Many higher-end phones these days have dual rear cameras for producing bokeh and depth-of-field effects; the ZenFone AR adds one more. The primary sensor is a 23MP beast designed to make your photos look as good as the real world does to your eyes. The other two sensors are more specialized: a depth sensor and a motion-tracking sensor. Individually, these sensors are neat little tricks – combined, though, the trifecta allows the ZenFone AR to track itself in space, measure distance, and – in ASUS’ words – “…create a three-dimensional model of its surroundings and track its motion, so it can see the world just like you do.”

In practice, I’ve found the camera to be very impressive, albeit a bit convoluted. There are a total of 17 modes across the rear and selfie cameras, each with its own set of completely customizable options: Auto, Manual, HDR Pro, Beautification, Super Resolution, Children, Low Light, QR, Night, Depth of Field, Filter, Selfie, Panorama, Miniature, Time Rewind, Slow Motion, and Time Lapse. Some of these (like QR, for example) are so wonderfully obvious it’s a small marvel that Google’s own AOSP app doesn’t have them yet, while others (Children) seem like they should have been left on the cutting-room floor.

Modes

  • Automatic: Detects the environment and dynamically scales settings for the best possible image. In my experience with the camera, this setting works well for 90% of all the pictures you’ll take.
  • Manual: Tweak all the settings in the Automatic mode (of which there are Legion) to your heart’s content to find that perfect shot.
  • HDR Pro: “Expands the dynamic range and enhances details in high-contrast or strongly backlit scenes” – think shooting into the sun. Honestly, I didn’t notice a whole lot of difference between this mode and Automatic.
  • Beautification: Takes your face – with all its beautiful flaws and perfect imperfections – and makes you look like a china doll. When people talk about Instagram filters ruining our perception of beauty, this is what they’re talking about.
  • Super Resolution: According to the description of this mode, the camera takes multiple shots and stitches them together to form the best possible image – which sounds a lot like the HDR burst approach Google uses, under a different name (see the rough sketch after this list).
  • Children: This mode takes a picture of people’s faces when they stand still for a moment – why it’s called Children mode, I don’t know. But it works as advertised.
  • Low Light: Rather self-explanatory; this mode enhances light sensitivity for clearer pictures in low-light environments without using the flash.
  • QR: This one should be in literally every camera app known to man. Scan a QR code.
  • Night: This mode features a slower shutter speed – and longer exposure – to capture more light in night-time shots.
  • Depth of Field: This is what the cool kids are calling Bokeh – in-focus foreground, out of focus background.
  • Filter: Apply one of a dozen or so overlays to your photos.
  • Selfie: While it sounds fairly standard, this mode is actually one of my favorites. Using the 23MP rear camera and its sensors, the ZenFone AR detects a number of faces you determine, then starts an audible countdown when everyone is in focus. Very useful for taking group selfies.
  • Panorama: Android’s had this one a while.
  • Miniature: As someone who likes to paint pewter and plastic miniatures, I assumed this one would help me take pictures of them – in reality, it’s designed to make life-sized objects look like small-scale models (a tilt-shift effect). Why? I’m not actually sure.
  • Time Rewind: This mode essentially functions as a pre-emptive burst, taking photos up to 3 seconds before and a full second after the shutter is pressed. The phone automatically analyzes the resulting images and shows you the best one – but allows you to choose from any of them. Very interesting, though I have not found a real-world application for it yet.
  • Slow Motion: I was really happy with the result of this mode, which is like standard video, but slows down fast-moving objects into slow-motion.
  • Time-Lapse: A classic. Captures frames at a slower-than-normal rate and plays them back at normal speed, so it looks like time is moving much faster than normal.
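To make the Super Resolution description a bit more concrete, here’s a minimal sketch of the general multi-frame idea: align a burst of shots to a reference frame and average the stack to beat down noise. This is an illustration of the technique, not ASUS’ actual pipeline, and the filenames are hypothetical:

```python
import cv2
import numpy as np

def stack_burst(frames):
    """Align each frame of a burst to the first (translation only),
    then average the stack to reduce noise."""
    ref_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    acc = frames[0].astype(np.float64)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-6)
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        warp = np.eye(2, 3, dtype=np.float32)
        # Estimate the shift between the reference and this frame.
        _, warp = cv2.findTransformECC(ref_gray, gray, warp,
                                       cv2.MOTION_TRANSLATION, criteria)
        h, w = frame.shape[:2]
        aligned = cv2.warpAffine(frame, warp, (w, h),
                                 flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
        acc += aligned.astype(np.float64)
    return (acc / len(frames)).astype(np.uint8)

# Hypothetical burst of four shots of the same scene.
burst = [cv2.imread(f"shot_{i}.jpg") for i in range(4)]
cv2.imwrite("stacked.jpg", stack_burst(burst))
```

Real pipelines (Google’s HDR+ included) go much further – per-tile alignment, robust merging, working on raw sensor data – but align-and-merge is the core of the trick.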

[Sample gallery: She organized our books. By Color. · Ooooh, glowy. · Puppy. · Sleepy puppy. · Wine is love. · Look at those greens! · I am Groot. · We are Groot. · Me, no filters. · Me, “Beautified.” It somehow got worse. · Me, using Selfie mode – this was auto-detected and auto-taken using the rear camera. · A 28mm fantasy miniature taken on Auto settings. · The same miniature with Depth of Field enabled. · Apparently miniatures can be Beautified, too. · Standard HDR picture. · “Super Hi-Res” picture – looks PRETTY similar. · Terribly creepy wooden bunny that magically appeared in my closet. · Said creepy bunny, in Low Light mode – a decent improvement.]

Project Tango

The idea behind Project Tango was great: pair robust, innovative hardware with software to turn your phone into an augmented-reality playground sophisticated and fine-tuned enough for games and applications alike. Google saw Tango as a platform that would revolutionize gaming, design, and, really, how we interact with the world. Though no longer in development (having been unceremoniously dumped for ARCore last year), we can still glimpse what Tango could have been by exploring the fledgling ecosystem on the last of the Tangos – the ASUS ZenFone AR. Even with the project dead, Tango has a number of functional apps and games still available for download: games, floor-plan mapping, interior design, and virtual reality are all represented in the Store, making use of the ZenFone AR’s additional sensors to craft immersive experiences.

In practice, though, it’s easy to see why Google decided to go in a different direction. Project Tango is a cute trick, but it hardly justifies a ridiculously high-tech camera on a phone that will likely never make nearly enough use of it. Project Tango, as shipped on the ZenFone AR, is a niche product – it was never going to make a splash with the common consumer, and probably not even with its intended audience. Google’s new solution, ARCore, shouldn’t need the specifications that Tango did – and as such won’t require the (likely expensive) extra hardware that ASUS managed to admirably cram into a smartphone.

Value

At $600+, the ZenFone AR is no budget phone – but with a strong CPU and 6GB of RAM (at minimum!), it still competes with some of the heavyweights on the market at a price that’s a bit lower than most of them, with the added benefit of the very impressive (albeit now-defunct) Google Tango technology. With Tango no longer in development, this is a hard purchase to justify, but the ZenFone AR is an extremely well-performing phone for its price point with gorgeous build quality, an impressive camera, and a beautiful display.

Buy the ASUS ZenFone AR at Amazon

14 Feb

UV lights in public spaces could kill airborne flu viruses on the spot


Eugenio Marongiu/Getty Images

Could overhead ultraviolet (UV) lights in offices and public spaces be used to stop and kill airborne flu viruses in their tracks? Quite possibly yes, claim researchers at the Center for Radiological Research at Columbia University Irving Medical Center.

For decades, scientists have known that certain wavelengths of ultraviolet light are able to effectively kill bacteria and viruses by destroying the molecular bonds that hold them together. This type of UV light treatment is widely used to decontaminate surgical equipment. Unfortunately, it can’t be used in public spaces because it is a health hazard which can lead to both skin cancer and cataracts. Fortunately, a way around this problem may have been discovered by Columbia scientists — and it involves a type of UV light called far-ultraviolet C, also known as far-UVC.

“We are using a specific type of ultraviolet light to selectively damage viruses and bacteria while being safe for human exposure,” David Welch, an associate research scientist at Columbia, told Digital Trends. “The physical basis of this lies in the limited penetration depth of far-UVC – it can damage viruses and bacteria which are very small in size, while it is unable to penetrate and cause damage to human cells which are much larger. Far-UVC is distinct from conventional UVC lamps which are dangerous because those wavelengths penetrate much further.”

In a study, the researchers demonstrated that far-UVC light can inactivate aerosolized viruses like the common H1N1 strain of flu using only very small doses of light. The experiment involved releasing H1N1 into a test chamber, exposing it to low doses of far-UVC light, and comparing the results against a control chamber that was not exposed. The illuminated chamber saw the flu viruses deactivated with the same efficiency as conventional germicidal UV lights. What makes this extra exciting is that such far-UVC lamps cost less than $1,000 each — and this cost would undoubtedly be driven down significantly were they to be mass-produced. In other words, this technology could be made available to the masses.
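For the curious, the “dose” language maps onto the standard first-order inactivation model used in UV disinfection work: the surviving fraction of a pathogen falls off exponentially with UV dose. A quick sketch of that relationship – the rate constant below is a made-up placeholder, not a figure from the Columbia study:

```python
import numpy as np

# First-order UV inactivation: surviving fraction S = exp(-k * D),
# where D is UV dose (mJ/cm^2) and k is a pathogen-specific constant.
k = 1.5                                   # cm^2/mJ, hypothetical value
doses = np.array([0.5, 1.0, 2.0, 5.0])    # mJ/cm^2
for d, s in zip(doses, np.exp(-k * doses)):
    print(f"dose {d:3.1f} mJ/cm^2 -> {s:.1%} of virus still viable")
```

The practical upshot of an exponential curve is that even modest doses, applied continuously from a ceiling fixture, can knock airborne virus concentrations down dramatically over time.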

“Our next steps are to continue our safety tests,” Welch continued. “Up to this point we have mainly looked at short term endpoints to determine damage, and these have all shown that far-UVC is safe. Future testing will extend this to examine the effects of exposure over a longer time period.”

A paper describing the work was recently published in the journal Scientific Reports.



14 Feb

A.I. is ready to advise us on how to best protect Earth from deadly asteroids


When people talk about using artificial intelligence to solve humanity’s biggest problems, there are few problems bigger than our planet’s survival. That is something a new algorithm called “Deflector Selector” is designed to aid with — by weighing up different possible solutions to deal with the possibility of a deadly asteroid heading in Earth’s direction.

“Our goal was to build a tool that would help us make funding and research decisions for asteroid deflection technologies,” Erika Nesvold, formerly of the Carnegie Institution for Science in Washington, D.C., told Digital Trends. “A lot of different technologies have been suggested for deflecting an asteroid on a collision course with the Earth, including the three we describe in our paper: nuclear explosives, kinetic impactors, and gravity tractors. But none of these technologies has been fully developed and tested in space, and some of them will work better than others.”

Nesvold and team started out by simulating attempted deflections of an asteroid on a collision course with the Earth and calculating the likely success of each method of deflection. As you would expect, this involved some heavy-duty math and computer processing power — since it meant simulating the potential distance of detection for more than 6 million hypothetical objects and the velocity change that would be necessary to change their course. To speed up the process, the team used machine-learning techniques.

“We used this data to train a machine-learning algorithm that could make this determination much faster than our simulations,” Nesvold explained. “So now we can feed in the characteristics of a population of impactors, and the algorithm can tell us which technology or technologies would work best.”
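As an illustration of the kind of simulation-trained surrogate Nesvold describes, here’s a toy sketch: generate labeled “impactor” data, fit a classifier, and then query it far faster than re-running the physics. The features, labeling rule, and model choice are all invented for demonstration – they have nothing to do with the paper’s actual simulations:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical impactor features: diameter (m), warning time (yr), velocity (km/s).
X = rng.uniform([10, 0.5, 5], [500, 20, 30], size=(10_000, 3))

# Stand-in labels a simulation might produce:
# 0 = kinetic impactor, 1 = gravity tractor, 2 = nuclear. (A toy rule, not physics.)
y = np.where(X[:, 0] > 200, 2, np.where(X[:, 1] > 10, 1, 0))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

# Query the surrogate for a new hypothetical impactor.
print(clf.predict([[350.0, 4.0, 18.0]]))  # -> best-technology index
```

The payoff is speed: once trained, a model like this can answer “which deflector?” for millions of hypothetical impactors in seconds, which is what makes population-level comparisons tractable.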

The results? That nuclear weapons can help dispatch around half of the potential objects, while kinetic impactors and gravity tractors score lower. Granted, a 50 percent chance of survival for humanity isn’t great — but the idea is that tools such as this can help decide which technologies we should be focusing our finite research budgets on.

“We’re hoping that the next steps will be to work with other experts in the asteroid deflection field to improve the Deflector Selector model to get more accurate results,” Nesvold said.

You can read a paper on the Deflector Selector project here.



14 Feb

Want your pet to age gracefully? Study finds tablet games can help


Messerli Research Institute/Vetmeduni Vienna

People will do just about anything for their dogs, but spoiling our pets can often do more harm than good. Dogs need stimulation, discipline, and a reasonable amount of mental exercise, even in old age. The problem is that as dogs mature, their humans tend to get lax on training and let their pets get away with lounging around and acting all stubborn.

Now, researchers from the Clever Dog Lab at the University of Veterinary Medicine in Vienna are conducting trials on a high-tech way to keep dogs engaged into old age. In a study recently published on the ACM Digital Library, the researchers used tablet games and touchscreen devices as a form of brain training for dogs, which they say can help pets stay more cognitively alert.

“We have been training and testing pet dogs since around 2008,” Lisa Wallis, a researcher and first author of the study, told Digital Trends. “Dogs of all ages … and different breeds and mixed breeds travelled to the lab and were trained by researchers and dog trainers. Over the years we have perfected our training technique and protocol. Due to the fact that we had a project running on senior dogs, we used the touchscreen to look for age differences in cognitive abilities. Many of the dog owners who participated were very skeptical about whether their dogs could learn to use the touchscreen. Indeed, the older dogs took longer to train, but were able to learn the protocol and to complete the tasks we set them.”

Within the Clever Dog Lab, Wallis and her colleagues put pets to the test with tablet-based brain-teasers that challenged the animals to interact with a touchscreen and complete simple digital tasks. For example, selecting an icon with its nose would elicit a treat from the trainer. It may have taken them more time but even older dogs seemed to eventually get the hang of it.

“The fact that the older dogs were able to learn such abstract and sometimes difficult tasks was very encouraging,” she said. “Not only were they able to learn, but many owners remarked how much their dogs enjoyed their touchscreen sessions.”

Wallis said this makes her optimistic that pet-centric tools like Pup Pod and CleverPet may help young dogs mature and mature dogs stay youthful, and she hopes to see more interactive toys enter the market. However, Wallis cautioned that animals are individuals too and what might positively stimulate one dog might frustrate another.

“Therefore for any pet, it is important to monitor their behavior and look for signs of stress when interacting with technology,” she said.

But if your dog seems to enjoy it, let her play! After all, if humans can benefit from technology, why shouldn’t our pets?



14 Feb

Harvard’s insect-inspired HAMR robot scuttles like a cockroach on meth


For many folks, the word “hammer” summons up distinctly 1990s images of baggy hip-hop pants and rapper-dancers going broke. There is nothing retro about Harvard University’s HAMR robot, however. An acronym derived from “Harvard Ambulatory MicroRobot,” it’s a cutting-edge, insect-inspired robot that can scamper along the ground at an impressive speed of a little under four times its own body lengths every second.

Unlike previous versions of the robot that Harvard has built, its latest iteration — the HAMR-F — no longer has to be tethered to a power source in order to function. While that makes it a little slower than its predecessor, it also opens up new possibilities in terms of freedom of movement.

“The Harvard Ambulatory MicroRobot is a quadrupedal robot that is inspired by cockroaches, having a similar size, mass, and body morphology to them,” Benjamin Goldberg, a researcher on the project, told Digital Trends. “Tethered versions of HAMR have previously been shown to run at speeds exceeding 10 body lengths per second and can perform agile turning and dynamic jumping maneuvers. The most exciting development with HAMR-F is that we are now able to take the robot outside of the lab with an onboard battery and electronics, while still maintaining high speeds and maneuverability.”

The ready-to-run robot weighs just 2.8 grams and is powered by an 8-mAh lithium-polymer battery. Eventually, the hope is that it will be able to move autonomously, but right now it still has to be controlled by a human operator – although this can be done wirelessly.

Harvard University

“The application for HAMR-F that we are most excited about is confined environment exploration,” Goldberg continued. “For example, HAMR-F could be used to search for defects within an engine cavity, within a pipe, or behind a wall. Our current version demonstrates really robust tether-free locomotion capabilities — however, many applications would still require sensors such as a camera or other specialized sensing modalities. HAMR-F has a substantial payload carrying capacity of around 50 percent of its own body weight and the circuit boards are compatible with many of these sensors, so hopefully, these types of applications are not far off.”

Goldberg said that the next step for HAMR’s development is to add more exteroceptive sensors, capable of wirelessly transmitting data back to a host machine. “This is a technology that we are very excited about commercializing because we see great potential for cost reduction and automation of some inspection tasks by opening up new pathways in challenging environments and tight spaces,” Goldberg said.

A paper describing the work has been submitted to the journal IEEE Robotics and Automation Letters.



14 Feb

Love Starman? Sorry, you’ll have to bid a final farewell on Valentine’s Day


SpaceX

If you developed a soft spot for Starman following his spectacular launch aboard SpaceX’s powerful Falcon Heavy rocket a week ago, then Valentine’s Day must seem like just about the worst day possible to bid farewell to your favorite “astronaut.”

But on February 14, the Starman and his Tesla Roadster will finally fade from the view of many telescopes as they drift ever deeper into space.

Don’t have any star-gazing kit of your own? Well, the good news is that thanks to the Virtual Telescope Project, you can still wave Starman off on his adventure, one that SpaceX CEO Elon Musk hopes will last “a billion years,” but which in reality could last hardly any time at all if the car crashes into some space rock or succumbs to radiation.

Starting at 7:15 a.m. ET on Wednesday, the Virtual Telescope Project is planning to live-stream the cherry-red Roadster one final time. Described as “a unique opportunity,” the stream will show Starman’s final visible moments as he drifts into darkness.

The Project, which started in 2006, gives amateur and professional astronomers online access to real, robotic telescopes and offers a range of services to the international community. With the help of Tenagra Observatories in Arizona, it’s been tracking the Tesla Roadster and Starman since February 6, when the Falcon Heavy successfully completed its first launch.

But if you tune in early Wednesday, don’t expect to see a close-up of Starman. The car is just over a million miles away from Earth and currently appears only as a faint dot among a sea of stars. The final live-stream is a chance to say goodbye as it enters what will likely be an orbit around the sun.

The wacky payload was an attempt by Elon Musk to make space “fun” and get people interested in, and inspired by, deep-space projects. This is the guy that wants to build a city on Mars, don’t forget.

Oh, and for anyone new to this story — and sorry to break this to anybody who’s formed a deep emotional bond with Starman in the past week — we should just state that he (or more accurately, “it”) is in fact a spacesuit-clad mannequin and not a real person … well, as far as we know.



14 Feb

These drones could team up to lift injured soldiers off the battlefield


Until robots replace human soldiers on the battlefield, it’s unfortunately a reality that soldiers run the risk of getting injured in war zones. In cases where that injury is severe enough, it may be necessary to physically evacuate the individual from the field of conflict to take them out of harm’s way. That’s where an innovative new project from researchers at Georgia Tech Research Institute aims to help.

Rather than resorting to anything as old-fashioned as medics with a stretcher, Georgia Tech engineers have been working on a system that involves teams of drones working together to lift wounded soldiers off the battlefield or, potentially, carry civilians out of a disaster area. This is achieved by using multiple eight-propeller drones, each capable of lifting a 65-pound object, which, when combined, can carry a person a distance of 500 yards. You can think of it a little bit like a version of Amazon’s proposed drone delivery service — with the exception that it could one day save lives rather than just delivery time.

“The difference between our system and other concepts of ‘drone delivery’ is that we allow multiple vehicles to carry a package together, rather than only using a single vehicle,” Jonathan Rogers, assistant professor at Georgia Tech’s George W. Woodruff school of mechanical engineering, told Digital Trends. “This allows the range of package weights that can be carried to be significantly expanded. For instance, if one drone can carry a package up to 15 lbs, four drones together can carry a package that is 60 lbs. It allows us to scale our lift capacity to different payload weights by adding more vehicles as needed.”
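Rogers’ scaling argument boils down to division with a ceiling. A minimal sketch – the safety margin is my own illustrative addition, not part of the Georgia Tech design:

```python
import math

def drones_needed(payload_lbs, per_drone_lbs, margin=1.2):
    """Minimum team size to lift a payload, padding the lift budget by a margin."""
    return math.ceil(payload_lbs * margin / per_drone_lbs)

# A 180 lb person, using the article's 65 lb-per-drone figure:
print(drones_needed(180, 65))  # -> 4
```

The interesting engineering, of course, is everything this one-liner hides: the drones in the team have to share the load evenly and agree on control actions in real time.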

Georgia Tech

As neat and obvious a solution as this may sound, there’s a reason why it hasn’t been widely adopted — it’s a tricky engineering feat to pull off. In order to achieve cooperative flight control, the drones must coordinate with one another on control actions. That’s especially difficult in cases where the center of gravity location of the package (in this case a person) is unknown. The drones also have to be mindful of where their partner drones are located, lest they collide with one another.

To cope with these challenges, Rogers’ team has developed special adaptive flight control algorithms.

“We have already designed and demonstrated a docking device that will allow these drones to connect to packages, which has been demonstrated in a series of flight tests,” Rogers said. “The next steps will include finalizing the cooperative flight control laws and performing flight experiments where multiple vehicles fly to a payload, connect to it, fly it to a destination, and detach from it. This will demonstrate that such a system can work in practice.”



14 Feb

Skydio’s R1 is the smart flying action cam we always dreamed of


Another drone with a built-in camera isn’t all that exciting, right? Try telling that to the Massachusetts Institute of Technology engineering graduates at Skydio, who this week introduced their new R1 flying action cam. Utilizing 13 cameras to view the world and an Nvidia Jetson chip more commonly used in self-driving cars, the R1 promises to be the smartest flying 4K camera you’ve ever tried — with a guiding mission to keep you in the frame at all times while dodging any obstacles around it.

“The R1 was designed for full autonomy,” Skydio’s CEO and co-founder Adam Bry told Digital Trends. “With other drones the autonomy features are targeted as pilot assistance, and they’re generally designed with the notion that there’s still a pilot ready to take over at a moment’s notice. Getting beyond this is very technically challenging, but that’s what we’ve been focused on for the last four years — and that’s what makes R1 special.”

The system is based on state-of-the-art computer-vision algorithms, which look at raw images and simultaneously balance obstacle avoidance against the need to capture amazing video. Navigating by sight alone is a frankly enormous challenge, and it’s one the R1 seems to pull off pretty darn effortlessly, all while traveling at speeds of more than 20 mph.

At $2,499, it’s not cheap, and its battery life is limited to 16 minutes, but if you’re in the market for a great action cam, the R1 should totally be at the top of your list. Whether you’re an athlete hoping for a hands-free recording of your practices or an outdoors enthusiast who hopes to capture some stunning vistas, this looks like a brilliant bit of tech.

“There’s potential for a broad range of applications, but initially, we see this as a product that sports enthusiasts, adventurers, creators and early adopters will love,” Bry continued. “There are always going to be people that enjoy flying a drone, but the R1 is less about flying a drone manually and more about being able to create content that you wouldn’t normally be able to create.”

We can’t wait to put it through its obstacle-avoiding paces!



14 Feb

These $1 test strips detect fentanyl in street drugs, could curb overdoses


Johns Hopkins

The United States is in the midst of an opioid crisis, which President Trump recently requested more than $13 billion to help control. At the heart of the epidemic is what seems to be an ever-changing supply of drugs making their way on to the streets. Not least of these is fentanyl, a synthetic painkiller often mixed with heroin but many times more potent. It can be fatal even in small doses and was responsible for more than 20,000 deaths in America in 2016, according to the National Institute of Drug Abuse.

Now, researchers have shown that low-cost test strips can be used to detect fentanyl in street drugs, warning opioid abusers of its presence and potentially saving them from a fatal overdose. A recent study by researchers at Johns Hopkins University and Brown University showed that these strips, which cost about $1 each, can assist drug users while informing a public discussion about one of the leading causes of the opioid crisis.

“[Our study] was a multifaceted approach to try to understand if these kinds of technologies work and how they will be accepted by people who need to accept them,” Susan Sherman, a professor of health, behavior, and society at Johns Hopkins, told Digital Trends. “Meaning either drug users or [health] service providers.”

Sherman and her colleagues ran the study in multiple parts, checking the validity of three drug-testing technologies, speaking to 335 drug users, and interviewing 32 representatives from groups that work with drug users. Through their research, they found that the low-cost strip had the lowest detection limit and the highest sensitivity in a comparison with more sophisticated technologies, and that both drug users and social workers welcomed the strips as a way to keep people safe.

“The strips are really great, particularly in markets where you don’t know how much fentanyl there is,” Sherman said. “If 100 percent of the drugs test positive for fentanyl, you don’t necessarily need to test the drug. But since we never really know the [street] drug market, it’s useful to have strips.”

Sherman responded to the criticism that such tools may enable drug abusers to use more drugs by pointing to decades of evidence from syringe exchange programs, which show that such programs don’t increase drug use.

“It’s misguided thinking that drug users don’t want to protect themselves or their well-being like anybody else would,” Sherman said. “This is a way to help support that.”



14 Feb

Best RPGs for your PlayStation 4



Looking for some awesome RPG games on your PlayStation 4? Check these out!

Uh oh! It’s game night, but your D&D group has canceled. Fortunately for you, there are other ways to scratch your role-playing itch. If you’re looking for some great RPGs for your PlayStation 4, you should probably check out this list.

  • Nier: Automata
  • Monster Hunter World
  • Undertale
  • Persona 5
  • Bloodborne

Nier: Automata

If you are a passionate fan of video games, you owe it to yourself to play Nier: Automata. Developer PlatinumGames uses Nier as an opportunity to work in the language of gaming itself, creating a lush and stunning meta-portrait of the medium. If gaming were to gaze into a mirror and find itself staring back, this game would be the end result.

Though it grew out of the Drakengard series, Nier: Automata does not require you to play the previous titles in order to enjoy it. It is an action role-playing game that you can enjoy on a surface level, but as you work your way through it, you will be rewarded with fantastic storytelling and a high level of replayability as the game morphs into something new with each playthrough.

See at Amazon

Monster Hunter World

Monster Hunter World was released by Capcom at the very beginning of 2018, and it’s already on track to become one of the best games of the year. Capcom managed to take the popular action RPG series and give it an approachability it has never had until now.

Travel to the New World to stalk and take down massive, stunning monsters. As you progress, you will use resources from the monsters you slay to create weapons and gear for greater and more dangerous expeditions. Monster Hunter World has a more reasonable learning curve than previous entries, but still has the depth of mechanics you would expect from a Monster Hunter game.

See at Amazon

Undertale

Undertale is one man’s love letter to RPGs. Developed and published by Toby Fox, it was originally released on PC in 2015, with a PlayStation version following in 2017. Undertale has garnered much praise and many awards for its innovative gameplay and writing.

While it may not be the most beautiful RPG on this list, it is certainly dripping with charm. The writing is at once fun and goofy while still maintaining a certain heart and beauty. If you love RPGs and want to try something that acknowledges the long history of the genre while still doing something entirely new, give Undertale a spin.

See at PlayStation

Persona 5

The sixth game in the Persona series, Persona 5 easily stands out as one of the best JRPGs of the last decade. As one would come to expect, Persona 5 has a gorgeous anime visual style, solid turn-based combat mechanics, and a dash of social simulation. Take on the role of the silent protagonist, transfer to Shujin Academy, and discover and learn to harness your Persona powers.

While Persona 5 is a stellar entry to the Persona series and a laudable RPG in its own right, I would be remiss if I were not to at least mention the painfully regressive and tone-deaf handling of LGBT characters. The only acknowledgment of the existence of anything but straight humans comes in the form of a ham-fisted and grotesque caricature of two gay men. Despite the fact that the series could use an update when it comes to certain social issues, it still manages to be a great JRPG.

See at Amazon

Bloodborne

If you’re looking for an action RPG that doesn’t hold your hand, you might want to consider Bloodborne. In fact, not only does it refrain from holding your hand, it might slap it every now and then. If you played either Demon’s Souls or Dark Souls and found that sort of granular combat enjoyable, then you are going to love Bloodborne as well.

With Bloodborne, FromSoftware takes the sort of grinding combat-RPG experience it does so well and transplants it into a gothic, Lovecraftian world to great effect. Take on the role of the Hunter to save the town of Yharnam. The town’s safety relies on your combat precision and your ability to not throw your controller across the room in frustration.

See at Amazon

Hopefully at least a few of these games can keep you satiated until next week when you can get back to some classic pen and paper role-playing. Or maybe you will be so ensconced in your new PS4 RPG that you will have to be the one to cancel the next D&D session.

What are your favorite PS4 RPGs?
