Facebook talks connectivity through drones, helicopters at F8 2017
Why it matters to you
Facebook’s latest connectivity tech could deliver internet in rural regions, disaster areas, and dense cities.

Facebook wants to connect the billions of people in the world who lack an internet connection, and it’s launching a volley of solutions at the problem. At the F8 developer summit on Tuesday, Facebook provided updates on its Aquila drone project, its Tether-tenna portable antenna, and its Terragraph node system.
With its Aquila drones, Facebook said it set a record with millimeter-wave radio, the technology it’s using to beam internet from the stratosphere to terrestrial microwave dishes. Engineers achieved a speed of 36Gbps over a distance of more than 10km, nearly double last year’s maximum of 20Gbps (and fast enough, by Facebook’s math, to stream 4,000 Ultra HD 4K movies simultaneously).
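For scale, here is a quick back-of-the-envelope check of that streaming claim; the per-stream arithmetic below is ours, not Facebook’s:

```python
# Quick sanity check of the streaming claim (illustrative arithmetic only).
link_gbps = 36    # record ground-to-air throughput
streams = 4_000   # claimed simultaneous 4K streams

per_stream_mbps = link_gbps * 1_000 / streams
print(f"Implied bitrate per stream: {per_stream_mbps:.0f} Mbps")  # -> 9 Mbps
```

Nine megabits per second is lean for 4K video (streaming services typically recommend 15Mbps to 25Mbps), so the figure assumes fairly aggressive compression.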
Aquila still has a ways to go — Facebook hasn’t tested the improved millimeter-wave technology on one of its drones yet, instead opting to use a Cessna aircraft circling about four miles away. But it believes that airborne millimeter-wave radio has potential. “The ground-to-air record modeled, for the first time, a real-life test of how this technology will be used,” Yael Maguire, a director for Facebook’s connectivity program, wrote in a blog entry posted during the keynote presentation.
Facebook is also developing a short-term connectivity solution for emergencies: “Tether-tenna.” The nascent project consists of a Volkswagen Beetle-sized helicopter and flexible antenna that can be “deployed immediately and operate for months at a time.” It’s in the early stages, but Maguire said the Tether-tenna will eventually be able to tap into a fiber line, plug into an electrical source, and then rise hundreds of feet in the air to broadcast a signal.
The challenges are myriad, Maguire said. Tether-tenna prototypes have only been able to operate up to 24 hours continuously, and they need to be able to survive high winds and lightning.
A more permanent solution is Terragraph, Facebook’s effort to replace fiber connections in “dense urban areas.” Speaking onstage at the F8 conference, Facebook vice president Jay Parikh described it as a “multi-node wireless system focused on bringing high-speed connectivity” to cities.
Terragraph, like Aquila, relies on open wireless standards to beam millimeter-wave radio between wireless nodes. But unlike Aquila’s, Terragraph’s transmitters are mounted on telephone poles, with Ethernet or Wi-Fi hubs mounted on the exteriors of buildings. Facebook said a single distribution node currently maxes out at 2.1Gbps, but that it expects speeds to improve as testing continues.
It’s not perfect. Millimeter-wave signals are prone to interference from water, and can’t travel through walls or windows. But ARIES, a new antenna design from Facebook’s Connectivity Lab, will help mitigate the issues. Its single base station is capable of eliminating noise and supporting as many as 24 different devices on the same spectrum, Facebook said.
“Slow internet speed is especially prevalent in developing economies, where mobile networks are often unable to achieve data rates better than 2G,” Facebook explained in a blog post. “Developed economies are hampered by Wi-Fi and LTE infrastructure that is unable to keep up with the exponential consumption of photos and video at higher and higher resolutions.”
Windows Insider update offers power savings and cleans up desktop icons
Why it matters to you
Even if you’re not a member of the Windows Insider program, keep an eye on the latest Insider builds for a sneak peek into the future of Windows 10.
Microsoft is hard at work on the Creators Update, releasing yet another Windows Insider build this week that bundles together some helpful new features and squashes a host of bugs. Build 16179 rolls out a new power-management scheme aptly titled “Power Throttling,” among some smaller improvements throughout Windows.
What is Power Throttling? Well, it is a working title for a new framework Microsoft hopes will improve the battery life users get out of laptops and mobile devices running the Windows 10 Creators Update.
The power throttling framework automatically categorizes every app you are running and effectively squeezes the life out of non-essential background applications. Doing so limits the amount of CPU resources these apps can access and should improve your battery life.
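Microsoft hasn’t published the framework’s internals, but the behavior it describes maps onto a simple policy: classify each running process as essential or background, then cap the CPU share of the background set. A minimal sketch of that idea follows; all names and thresholds here are invented for illustration and are not Windows APIs:

```python
# Illustrative sketch of a Power-Throttling-style policy; the names and
# thresholds are hypothetical, not the actual Windows internals.
from dataclasses import dataclass

@dataclass
class App:
    name: str
    is_foreground: bool       # e.g. has focus or is playing audio
    cpu_quota_pct: int = 100  # share of CPU time the scheduler grants

def apply_power_throttling(apps: list[App], background_cap: int = 20) -> None:
    """Cap CPU access for non-essential background apps."""
    for app in apps:
        app.cpu_quota_pct = 100 if app.is_foreground else background_cap

apps = [App("video_editor", True), App("cloud_sync", False), App("updater", False)]
apply_power_throttling(apps)
for app in apps:
    print(f"{app.name}: {app.cpu_quota_pct}% CPU quota")
```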
According to Microsoft’s internal tests, power throttling can potentially reduce CPU power consumption by about 11 percent during strenuous use cases. It remains to be seen, though, whether or not everyday users will experience those kinds of gains on a regular basis — or if those numbers are just the best-case scenario from Microsoft’s internal tests.
Like most Insider builds, Build 16179 comes with a robust catalog of bug fixes, adjustments, and minor features. First among them is a new feature for virtual machine users, which Microsoft is calling Revert VM. It creates a checkpoint every time a virtual machine is started, allowing users to roll back to the last boot and undo any mistakes.
Additionally, desktop icons will no longer wander around on their own. Microsoft patched a bug that caused some users’ desktop icons to move around unexpectedly when “Auto-arrange icons” was enabled. Microsoft also fixed a bug that caused Microsoft Edge to crash for Hindi-language users every time they launched the browser.
As always, you can browse the full details of every new feature, tweak, and minor bugfix in Build 16179 on the Windows Insider blog.
Facebook’s newest tech will let you type with your brain and hear with your skin
Why it matters to you
These breakthroughs may seem light-years away, but Facebook says they are in fact closer than we think, with massive implications for the future of communication in our lifetime.

Facebook has some pretty surprising ideas about the future of communication, and they extend far beyond news feeds and even augmented reality. The social media giant announced during day two of its F8 developer conference that its Building 8 hardware lab is working on technology that will one day allow you to type with your thoughts and hear through your skin.
Our brains, along with the cochleas in our ears, possess the power to reconstruct language from components, and Facebook is looking at hardware and software to transmit those components to the body via pressure changes and vibrations. During the day-two keynote, the company showed a video of one of its engineers repeating words communicated to her through sensors embedded in a sleeve on her arm. Simple words, like “blue” or “cube,” were sent through a smartphone, and the engineer was able to understand them without a single word being uttered.
The concept may seem crazy at first glance, until you consider the work that has already been done in the field, dating as far back as the development of Braille in the 19th century. Facebook is building upon that foundation, with the ultimate goal for one person to be able to “think in Mandarin,” and someone else to instantly “feel in Spanish.”
Regina Dugan, who heads the division and formerly led Google’s ATAP group, said we’re much closer to these goals than many realize. The company is also working on a project that would allow humans to type 100 words per minute using only their brains. Thanks to improved non-invasive sensors, breakthroughs in optical imaging technology, and machine learning, the groundwork is already in place: Facebook showed a patient with ALS typing a handful of words with her mind. More than 60 scientists from universities around the world are working with Facebook to make the goal of typing 100 words per minute with the brain possible.
Still, the social media company is well aware of the privacy questions that will inevitably surface with this initiative. Dugan stated the objective is to achieve the speed of voice with the privacy of text, likening the approach to sharing photos online. Of the many thoughts we have, there are only some we’d actually like to share, Dugan said, and Facebook is not interested in broadcasting the random noise in your head.
“Imagine the power such a capability would give to the 780 million people around the world who cannot read or write — but who can surely think and feel,” she said.
Don’t expect to see any of this technology in the real world this year, though Dugan said it could arrive within a few years.
Smart bandage uses nanosensors to track how a wound is healing
Why it matters to you
Bandages are currently used to keep dressings clean to avoid infection, but with smart nanosensors they could do so much more.
Bandages are intended to keep a dressing secure and clean in order to reduce healing time and infection rate. However, they may be about to get a new use case, courtesy of a project from the United Kingdom’s Swansea University Institute of Life Science.
Researchers there have been working on a new smart bandage capable of tracking how a wound is healing and sending that data back to doctors via 5G. To do this, it would employ tiny “nanosensors” that fit comfortably within the fabric of regular bandages.
The resulting smart bandage would allow doctors and caregivers to know exactly at which stage in the recovery process a wound is, thereby allowing them to tailor their treatment more accurately for the patient.
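Swansea hasn’t disclosed its design (more on that below), so purely as an illustration, a telemetry loop for such a bandage might look something like the sketch below; every field, value, and identifier here is invented:

```python
# Purely illustrative telemetry for a hypothetical smart bandage; Swansea
# has not disclosed its actual sensors, data fields, or transport details.
import json
import time

def read_nanosensors() -> dict:
    """Placeholder readings; a real device would sample embedded sensors."""
    return {"temperature_c": 34.8, "ph": 7.9, "moisture_pct": 62}

payload = {
    "bandage_id": "demo-0001",      # invented identifier
    "timestamp": int(time.time()),
    "readings": read_nanosensors(),
}
print(json.dumps(payload))  # in concept, pushed to clinicians over a 5G uplink
```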
“Chronic wound management is an initial focus for development, and early application as it is a major challenge for health systems,” Marc Clement, chairman of the Institute of Life Science, told Digital Trends. “Supporting this management outside of the hospital setting. Increasing rates of diabetes and other contributory factors compound this need.”
Unfortunately, Clement wouldn’t spill more details about how exactly the tech works, since the smart bandage concept integrates a number of commercially sensitive technologies that its investors are hoping to protect.
However, the hope is that the technology could be trialed within the next 12 months. “The next stage of research involves integration of the concept into clinical applications [and] pathways, and testing of core technologies,” Clement continued. This work will reportedly involve experts from the Welsh Wound Innovation Centre.
There’s no word on exactly when smart bandages might be available to the general public, but Swansea University is clear about its focus on being at the forefront of the intersection between technology and healthcare. In June this year, it will host a one-day symposium on “Digital Futures in health and well-being,” questioning whether public services can survive without embracing smart technology.
Arm-mounted underwater jet drive looks like it came from a James Bond movie
Why it matters to you
Because who hasn’t wanted to be propelled through the water via arm-worn jet propulsion like a superhero or a 1960s British spy?
We can just see “Q” saying, “Now pay attention, 007!” to Mr. Bond while showing off this new device.
Resembling something straight out of a Sean Connery-era James Bond movie, Scubalec is an exciting new Kickstarter crowdfunding project for a handheld, arm-mounted personal jet drive, designed to propel intrepid users through the water.
“People go snorkeling because it’s a great way to explore the underwater world,” creator Un-Yong Park told Digital Trends. “For those people who love snorkeling, we’ve created a handy device to bring even more fun to the underwater experience. Simply put Scubalec on your arm, and it pulls you just in the direction you point, easy and simple.”
The Scubalec comprises two small jet drives combined with a 7,500mAh lithium-ion battery that, fully charged, provides 10 to 12 minutes of continuous propulsion. Park describes the experience of using it as being “like a cyclist with a tailwind on their back.”
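Those figures imply a surprisingly power-hungry drive. Assuming a standard 3.7V nominal lithium-ion voltage (our assumption; Park didn’t specify), the back-of-the-envelope math works out like this:

```python
# Back-of-the-envelope power draw for the Scubalec; the 3.7V nominal
# cell voltage is our assumption, since Park didn't specify one.
capacity_mah = 7_500
voltage_v = 3.7
runtime_min = 11  # midpoint of the quoted 10-12 minutes

energy_wh = capacity_mah / 1_000 * voltage_v   # ~27.8Wh stored
avg_power_w = energy_wh / (runtime_min / 60)   # ~151W average draw
print(f"{energy_wh:.1f}Wh battery -> {avg_power_w:.0f}W average draw")
```

Roughly 150 watts of continuous draw, which helps explain the short run time.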

To give it a suitably retro appearance (did we mention this looks like something out of an old spy movie?), Park took his design inspiration from the iconic P-51 Mustang, a single-seat fighter aircraft used in World War II and active from the 1940s through the ’60s. The result is undeniably neat, and sure to be an attention-grabber on your next family vacation.
Currently, the Scubalec project has raised around one-third of its 10,000-euro ($10,700) target, with 28 days still to go. A pledge of $297 will secure you a Scubalec and wall charger, with postage included. Shipping is set for this June.
A more expensive option with an extra battery is also available. You know, in case whichever megalomaniacal evil villain’s island volcano base you’re headed toward happens to be more than 10-12 minutes’ swim away.
Just remember to wear your freshly ironed white dinner jacket and ruffled shirt under your wetsuit, so you’re good to go the moment that you arrive on dry land.
Facebook’s new Surround 360 VR cameras will go on sale this year
Facebook just introduced a second-generation Surround 360.
The social network is kicking off the second day of its (mostly) annual F8 developer conference, and buried in the flurry of updates and sessions was an announcement that the Surround 360, which Facebook unveiled last year as an open-source VR camera design, has a successor. Actually, it has two: a larger version called the x24 and a smaller, more portable version called the x6.
The x24 has a 24-camera array laid out in an orb, rather than the original’s 17 cameras arranged in a saucer shape, while the x6 has six cameras. Last year, Facebook released design schematics for the Surround 360, but now it’s partnering with hardware manufacturers to bring the products to market later this year. Facebook won’t sell the cameras directly, according to The Verge.
Although Facebook didn’t confirm whether the cameras will still sport its branding, the finished products will be based entirely on Facebook’s design and spec guides. The x24 and x6 can capture in 8K with six degrees of freedom (6DOF), which will let you move your body around a scene as long as you’re wearing a VR headset with positional tracking. The Oculus Rift, for instance, would work.
Your average consumer probably won’t buy these basketball-shaped 360-degree video cameras, however. They’re professional-grade cameras for high-end image makers and cinematographers. They capture rich depth information and use software that better understands light data and the depth of objects, yet footage can still play back in Facebook’s News Feed as standard 360 video.
They can even play back in Gear VR as stereoscopic 3D-360, though with each step down in format, viewers lose features like head and body tracking. Nevertheless, Facebook said these cameras should have a big impact on the 360-degree format, giving developers the ability to create more engaging videos, including ones that will let you move around inside live-action scenes.
The cameras will even let video-makers edit live-action captures with CGI imagery. And finally, Facebook confirmed that the x24 and x6, which don’t yet have firm pricing or release dates, won’t replace the original Surround 360, which is now called Surround 360 Open Edition.
Capcom’s collection of Disney NES games does retro gaming right
Recapturing the feeling of playing video games of bygone eras is no easy task. Sure, a simple emulator can technically drag ancient software to modern television screens, but anyone who grew up playing 8-bit games can tell you the experience isn’t always the same. Pixel-perfect presentations can sometimes lack the nostalgic charm of the distorted, fuzzy tube TVs old games were originally designed for. That’s one of the things that makes Capcom’s Disney Afternoon Collection so special. It doesn’t just collect DuckTales, Rescue Rangers and other iconic NES Disney games in one convenient package — it wraps them in the style, context and visual limitations of the 1990s.
Everything about this Disney collection harkens back to the 1990s in some way — from the neon-triangle game logo and sprinkle-confetti menu backgrounds to the 8-bit music, classic gameplay and retro visual filters. From the moment the game first boots up, it’s poised to put the player in the mindset of the games’ original era. This is part of developer Digital Eclipse’s four-point philosophy of classic game re-releases: Accuracy, Context, Presentation and Sustainability. “All of our philosophies go back to artistic intent,” Frank Cifaldi told Engadget. As Digital Eclipse’s Head of Restoration and the founder of The Video Game History Foundation, he has a passion for preserving classic games as they were intended to be experienced. Figuring out how to do that is a lot more complicated than just putting old code on new screens.

“I don’t think that there’s a definitive answer to the artistic intent of graphics of an NES game,” Cifaldi says. “If you rounded up everyone who worked on these games, you’d get some different answers.” That’s why games built on the developer’s Eclipse engine, like The Disney Afternoon Collection and Mega Man Legacy Collection, allow users to choose from multiple visual modes — including an unfiltered view, a TV mode with CRT effects and a simple scanline “monitor” setting. That same commitment to artistic intent, however, is also why other display modes were left out.
“When you play an NES game on an emulator, typically every pixel is exactly square,” he says. “But if you were to actually display an NES game on a CRT, it’s a little bit wider than that.” The kind of pixel-perfect displays we use today just weren’t available when these games were originally released, and many developers likely built games with the limitations of CRT monitors in mind. That’s why the Eclipse engine uses pixels that are just slightly wider than they are tall, and includes light color bleeding and afterimage effects in its TV filter mode. “We’re bringing these games back to the way they were meant to be viewed.”
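That pixel-geometry point is easy to quantify: the NES outputs a 256x240 image, and on a CRT each pixel displays at roughly an 8:7 width-to-height ratio. A minimal sketch of the resulting scaling follows; the exact ratio Digital Eclipse uses isn’t published, so 8:7 is an assumption based on the commonly cited CRT value:

```python
# Illustrative non-square-pixel scaling for NES output. The 8:7 pixel
# aspect ratio is the commonly cited CRT value, not Digital Eclipse's spec.
NES_W, NES_H = 256, 240
PIXEL_ASPECT = 8 / 7  # each source pixel is drawn slightly wider than tall

display_w = round(NES_W * PIXEL_ASPECT)  # ~293 square pixels wide
print(f"Render the {NES_W}x{NES_H} source at {display_w}x{NES_H}")
```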
At the same time, Cifaldi couches the developer’s artistic choices with caution. “I put air quotes around that,” he says, repeating that there’s no true, definitive catch-all answer for how old games are supposed to look. “You just have to sort of make a series of decisions that all revolve around what is artistic intent and what is the best player experience.”
When Digital Eclipse developed its retro game porting engine for Mega Man Legacy Collection, that meant making stylistic choices that lend the games a CRT feel even when played on a high-definition screen. It’s an effort less about simulating ancient TV technology than about tricking the eye. “That ghosting that we do? Totally inaccurate. That’s not the ghosting that you would see on a CRT, but it still looks like it to you, right?” It does. The effects developed for the company’s Mega Man project carry over well to The Disney Afternoon Collection, presenting the classic games with a subtle TV filter that, despite technically distorting each game’s graphics, makes them look better than they might as pure pixel output. At least to some eyes.
This attention to detail given to the games’ visuals is fantastic, but in some ways it’s just the cherry on top of an otherwise great package. Underscored with remixed chiptune music, the game’s neon menu draws users into the ’90s mindset, and a digital museum of game boxes, retro advertisements and development artwork provides ample context for the era these games were made in. Even for those unconcerned with the history of the games, it’s a good value, offering a handful of excellent but rare Nintendo classics for $20.

Any Nintendo fan who grew up watching the Disney Afternoon would be remiss not to look into this collection. DuckTales and its sequel are both solid platformers in their own right, as are Darkwing Duck and both Rescue Rangers games. Heck, even TaleSpin is pretty good once you get past the first level — but Digital Eclipse’s work on The Disney Afternoon Collection also serves as an example of what passionate retro gamers can do when they have the resources and support of the original license holders. “We think about this stuff more than most humans should think about it,” Cifaldi says. “How to, like, accurately present an NES game. That’s just our passion; that’s where we come from.” The passion shows.
Facebook details its plans for a brain-computer interface
Facebook wants you to use your brain to interact with your computer. Specifically, instead of using something primitive like a screen or a controller, the company is looking into ways that you and I can interact with our PCs or phones just by using our minds. Regina Dugan, the head of Building 8, the company’s secretive hardware R&D division, delved into this on stage at F8. “What if you could type directly from your brain?” she asks.
In a video demo, Dugan showed the example of a woman in a Stanford lab who is able to type eight words per minute directly with her brain. This means, Dugan says, that you could text your friends without using your phone’s keyboard. She goes on to say that in a few years’ time, the team expects to demonstrate a real-time silent speech system capable of delivering a hundred words per minute. “That’s five times faster than you can type on your smartphone, and it’s straight from your brain,” she said. “Your brain activity contains more information than what a word sounds like and how it’s spelled; it also contains semantic information of what those words mean.”
And that’s not all. Dugan adds that it’s also possible to “listen” to human speech by using your skin. It’s like using Braille, but through a system of actuators and sensors. Dugan showed a video example of how a woman could figure out exactly what objects were selected on a touchscreen based on inputs delivered through a connected armband. The armband’s system of actuators was tuned to 16 frequency bands, and has a tactile vocabulary of nine words, learned in about an hour.
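One way to picture the encoding: each word in the vocabulary maps to a distinct activation pattern across the armband’s frequency bands, and the wearer learns to recognize the patterns. A toy sketch under that assumption (the band assignments are invented; Facebook hasn’t published its actual scheme):

```python
# Toy encoding of a tactile vocabulary: each word activates a distinct
# subset of the armband's 16 frequency bands. The band assignments are
# invented here; Facebook has not published its actual scheme.
TACTILE_VOCAB = {
    "blue":  {0, 3, 7},
    "cube":  {1, 4, 12},
    "grasp": {2, 8, 15},
}  # ...filled out to the nine-word vocabulary described in the demo

def activate(word: str) -> list[int]:
    """Return on/off drive levels for each of the 16 actuator bands."""
    bands = TACTILE_VOCAB[word]
    return [1 if i in bands else 0 for i in range(16)]

print(activate("blue"))  # the wearer learns to recognize this pattern
```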
This, Dugan says, also has the potential of removing language barriers. “You could think in Mandarin, but feel in Spanish,” she said. “We are wired to communicate and connect.”
Of course, a lot of this tech is still a few years out. And this is just a small sample of what Dugan has been working on since she joined Facebook in April 2016. She served as the 19th Director of the United States’ Defense Advanced Research Projects Agency and she’s the former head of Google’s Advanced Technology and Projects group (no big deal).
The stuff that Facebook is creating at Building 8 is modeled after DARPA, and would integrate both mind and body. “Our world is both digital and physical,” she said, arguing there’s no need to put down your phone in order to communicate with the people in front of you. That is a false choice, she said, because you need both. “Our goal is to create and ship new, category-defining consumer products that are social first, at scale,” she said. “We can honor the intimacy of what’s timeless, and also create products that refuse to accept that false choice.”
In a Facebook post, CEO Mark Zuckerberg states: “Our brains produce enough data to stream 4 HD movies every second. The problem is that the best way we have to get information out into the world — speech — can only transmit about the same amount of data as a 1980s modem. We’re working on a system that will let you type straight from your brain about 5x faster than you can type on your phone today. Eventually, we want to turn it into a wearable technology that can be manufactured at scale. Even a simple yes/no “brain click” would help make things like augmented reality feel much more natural.”
Source: Facebook
Sling TV streams live broadcasts to your LG Smart TV
If you’re a cord cutter, you no longer have to worry about buying a dedicated media player just to watch live broadcasts on your LG TV. Sling TV’s internet-only service is now available as an app on “most” of LG’s 2016-era webOS sets, with 2017 models due to get it in the months ahead. In theory, that puts internet-only viewing just a launch-bar shortcut away.
This isn’t Sling’s first TV integration, but it’s an important one. It’s helping to create a future where you can assume that your TV set can handle any kind of live broadcast, whether it’s online or from a cable box. And of course, it’s important for Sling as well — this opens the doors to people who might not even realize that Sling TV exists, or aren’t willing to pay for a separate device just to get live channels.
Source: Sling