Will Galaxy S8 be Samsung’s first phone with a Force Touch display?
Samsung is really trying to come up with new ways to lure in customers.
Despite the major fiasco caused by the Samsung Galaxy Note 7 and its faulty, fire-prone battery, which led to two major government recalls and a permanent end to production, Samsung is pushing forward with the next version of its flagship: the Galaxy S8. The smartphone is rumoured to feature many bells and whistles, including an edge display on both models and, according to a recent report, a Force Touch-like display.
If true, the Galaxy S8 would be the first Samsung smartphone to come with pressure-sensitive display technology, which Apple calls “3D Touch” on the iPhone and “Force Touch” on the Apple Watch and MacBook. An anonymous source told Korean news publication The Investor that Samsung hopes to partially adopt the technology for the Galaxy S8, with full adoption across the range apparently not coming for another one or two years.
Force Touch-like technology is built around a pressure-sensitive display, which enables an extra level of functionality based on how hard you press. Press firmly on an app icon, for instance, and a menu appears with contextually relevant options, such as composing a message or taking a selfie. Apple has offered the feature on the iPhone since the 6S launched in 2015, as well as on the Apple Watch and newer MacBooks.
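To illustrate the idea (this is a hypothetical sketch, not Apple’s or Samsung’s actual implementation), pressure-sensitive input boils down to a touch event carrying a force reading, with a threshold deciding between an ordinary tap and a “deep press” that surfaces quick actions. All names and values below are made up for illustration:

```python
# Hypothetical sketch of pressure-sensitive input handling.
# A touch reports a normalized force in [0.0, 1.0]; crossing a
# threshold upgrades a plain tap to a "deep press" quick-action menu.

DEEP_PRESS_THRESHOLD = 0.6  # assumed cutoff, tuned per device in practice

def handle_touch(app_icon, force):
    """Return the action triggered by pressing an icon with a given force."""
    if force >= DEEP_PRESS_THRESHOLD:
        # Deep press: surface the icon's contextual quick actions,
        # e.g. "take selfie" for a camera app.
        return ("quick_action_menu", app_icon["quick_actions"])
    # Light tap: just launch the app.
    return ("launch", app_icon["name"])

camera = {"name": "Camera", "quick_actions": ["take selfie", "record video"]}
print(handle_touch(camera, 0.3))  # ("launch", "Camera")
print(handle_touch(camera, 0.8))  # quick-action menu with the icon's options
```

On real hardware the force value comes from the display stack itself (on iOS, for example, via the touch APIs), but the tap-versus-deep-press decision works along these lines.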
Other rumoured features for the Galaxy S8 include two cameras and a dual-curved screen. Check out Pocket-lint’s rumour roundup for more details on what the upcoming flagship might feature.
The phone will likely be announced in early 2017.
Google’s self-driving cars master tricky three-point turns
With supercomputers for brains and perfect 360-degree vision, one would imagine it would be fairly simple for an autonomous vehicle to pass the time-honored driver’s ed test of performing a three-point turn. But according to the October report from Google’s self-driving car project, that is one of the many things that are easier said than done for a robot vehicle.
While a human driver can estimate the best angle and distance, the amount of data and information the autonomous vehicle has can actually be a drawback. “Our challenge is to teach our self-driving cars to choose the option that’s not only the quickest, but one that feels natural to passengers,” the report says. So, even though some turns would be easier done in reverse, for example, people prefer to travel moving forward, where they can see what’s happening. “So we’ve taught our cars to mimic these human patterns, favoring wider forward arcs, rather than a series of short movements back and forth.”
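That preference can be framed as a cost trade-off: the planner scores candidate maneuvers, and reversing or shifting direction carries a penalty so that a single wide forward arc can beat a quicker shuffle of short back-and-forth moves. The sketch below is a simplified illustration of that idea, not Google’s actual planner; all names and penalty values are invented:

```python
# Hypothetical cost model for choosing a three-point-turn plan.
# Each plan is a list of (direction, meters) segments. Reverse driving
# and extra direction changes are penalized, so a wide forward arc can
# win even though it covers more distance.

REVERSE_PENALTY = 5.0   # assumed cost multiplier per meter driven in reverse
SEGMENT_PENALTY = 3.0   # assumed cost per direction change

def plan_cost(plan):
    """Score a maneuver plan; lower is better."""
    cost = 0.0
    for direction, meters in plan:
        cost += meters * (REVERSE_PENALTY if direction == "reverse" else 1.0)
    cost += SEGMENT_PENALTY * (len(plan) - 1)  # penalize back-and-forth
    return cost

wide_forward_arc = [("forward", 12.0)]
quick_shuffle = [("forward", 3.0), ("reverse", 2.0), ("forward", 3.0)]

best = min([wide_forward_arc, quick_shuffle], key=plan_cost)
print(best)  # the wide forward arc wins despite the longer path
```

Tuning those penalties is one way to encode “feels natural to passengers” as something a planner can optimize.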
In order to get all these turns right, Google’s vehicles are practicing far more turns than most drivers ever will before hitting the road: about 1,000 every week. And, despite some hang-ups with California’s autonomous vehicle regulations, Google’s vehicles have driven over 2.2 million autonomous miles in Washington State, California, Arizona and Texas to date.
Via: CNET Roadshow
Source: Google Self-Driving Car Project October Report [PDF]
National Geographic’s ‘Mars’ is like a SpaceX infomercial
Like a prequel to The Martian crossed with an educational documentary, National Geographic’s Mars is an earnest attempt at inspiring a new generation about a manned mission to the red planet. But what’s most intriguing is how much it puts Elon Musk’s SpaceX front and center (which has already laid out its plans to get to Mars), even more so than work from NASA and other space agencies. It’s a sign of the times: Our next step into the beyond will likely involve a benevolent billionaire as much as it does the cooperation of Earth’s most technologically advanced countries.
The series, which premieres tonight on National Geographic and has the backing of Oscar-winning producers Ron Howard and Brian Grazer, is a unique hybrid of storytelling techniques. On one side, it’s a drama set in 2033 following the crew of the first human mission to Mars. On the other, it’s a documentary diving into how we’ll get there. But all the way through, it’s hard not to feel the mark of SpaceX.
While the documentary portion of the show has a bevy of notable interviewees, including Neil deGrasse Tyson and The Martian author Andy Weir, it inevitably shifts back to Musk and all of his company’s work. That makes sense when you consider just how important SpaceX’s reusable rockets will be for a Mars mission. (After all, how else would the first band of astronauts get back home?) To its credit, the show doesn’t shy away from SpaceX’s failures; in fact, it features two prominent clips of the company’s Falcon 9 rockets exploding (along with the devastated reactions of its employees).
The dramatic side of the show centers on a fictional space agency, the International Mars Science Foundation, which is led by a charismatic businessman who “can sell anything.” With an engineer’s love for his spaceship, it’s hard not to see that character as Musk with a different accent. While the IMSF is a private organization, it has investors from all over the globe, which explains its diverse group of astronauts. Much like The Martian, the show presents a future where the world is working together towards a common goal: make it to Mars and set up a colony.
Naturally, there are problems along the way. As the Daedalus crew nears Mars, they encounter thruster issues which could make it impossible for them to land safely. They survive — there wouldn’t be a show if they didn’t — but it’s an ominous start. The key takeaway? Space exploration is exciting, but it’s also fraught with peril.
Mars doesn’t have the thrills of The Martian, but its hybrid drama/documentary could entice younger viewers who wouldn’t be caught dead watching a typical science show. It’s also an intriguing big-budget move for National Geographic, a network better known for endlessly repeating low-rent nature shows.
New York City’s free gigabit WiFi comes to Brooklyn
LinkNYC’s free, gigabit-grade WiFi is all over large chunks of New York City, but there’s a conspicuous Brooklyn-sized gap… or rather, there was. The communication network has switched on its first two Brooklyn kiosks, both of them on Fulton Street in the Bed-Stuy area. Don’t worry about having to visit a small part of the borough to get no-cost internet access, though. There are nine other Brooklyn hotspots due to go online in the weeks ahead, including some near LIU-Brooklyn and Prospect Park.
Staten Island also has its first kiosk, so those in the southwestern corner of the city don’t have to cross a bridge to see what the fuss is about.
As before, the LinkNYC kiosks aren’t just about free internet access. You also have free domestic calls (including easy connections to 311 and 911) and USB ports to charge your mobile devices. While you can’t browse the web from them like you could early on, they’re potentially crucial to the homeless, travelers and others who don’t have guaranteed internet and phone service in the Big Apple.
Source: LinkNYC
Dissecting the alien language in ‘Arrival’
One of the coolest bits of Arrival isn’t the sci-fi movie’s Lovecraftian aliens or its stunning cinematography (although, to be fair, both are great); it’s the Heptapods’ language. Figuring out a way to communicate with the beings without provoking a war is central to the first-contact story’s plot. While their spoken language is basically a series of low-frequency grunts and groans, the inky “written” version of it resembles an ouroboros that’s written and read from left to right and right to left simultaneously.
In a series of recent tweets, writer/producer Eric Heisserer explained not only how the circular speech symbols came to be, but also the “bespoke logogram analytic code” that translated the language while the cameras were rolling. “In several shots in the film, the analytics you see are working in real-time to dissect a logogram,” Heisserer writes. “Not canned CG.”
That code looked like this: pic.twitter.com/BstvUct4PQ
— Eric Heisserer (@HIGHzurrer) November 14, 2016
Bringing the language to the screen was a joint effort between designer Patrice Vermette, science consultant Stephen Wolfram — of Wolfram Alpha fame — and his son Christopher Wolfram. All told, some 100 “unique logograms with embedded words and phrases, with mutable components” were crafted for the film.
The elder Wolfram penned a lengthy blog post about the science of the movie, too. He explains the “shell” ship design of the aliens, the work he did to ensure the science was accurate while still being entertaining and the extent that his Mathematica software was used therein. He even dishes on the whiteboard full of formulas shown in the movie, and what went into ensuring its contents were probable and plausible.
Of course, both Heisserer’s tweets and Wolfram’s blog post are rife with spoilers, so if you’re trying to go into the movie completely blind you’re going to want to avoid both. Arrival is in theaters now.
Source: Eric Heisserer (Twitter), Stephen Wolfram
Google is restricting AdSense ads on fake-news sites
Google made headlines recently when the top search result for ‘final election numbers’ turned out to be patently false. Now the company is going to keep fake-news sites from using its nigh-ubiquitous AdSense program, according to the Wall Street Journal. In a statement, the search giant says that, as part of an update to its publisher policies, it will restrict ad serving on websites that “misrepresent, misstate or conceal information about the publisher, the publisher’s content, or the primary purpose” of the site.
With the surfeit of questionably sourced or written articles that popped up during this year’s election cycle, it’s refreshing to see at least one of the largest players in tech taking responsibility and addressing how information is distributed and presented. That sound you hear is the ball bouncing around on Facebook’s court. We’ve reached out to Google for additional information and will update this post should we hear back.
Source: Wall Street Journal
Apple Testing Augmented Reality ‘Smart Glasses’ That Connect to iPhone
As part of its effort to expand further into wearable devices, Apple is working on a set of smart glasses, reports Bloomberg. Citing sources familiar with Apple’s plans, the site says the smart glasses would connect wirelessly to the iPhone, much like the Apple Watch, and would display “images and other information” to the wearer.
Apple has contacted potential suppliers about its glasses project and has ordered “small quantities” of near-eye displays, suggesting the project is in the exploratory prototyping phase of development. If work on the glasses progresses, they could be released in 2018.
Google Glass
Apple’s glasses sound similar to Google Glass, the head-mounted display that Google first introduced in 2013. Google Glass used augmented reality and voice commands to allow users to do things like check the weather, make phone calls, and capture photographs. Apple’s product could be similar in functionality.
The glasses may be Apple’s first hardware product targeted directly at AR, one of the people said. Cook has beefed up AR capabilities through acquisitions. In 2013, Apple bought PrimeSense, which developed motion-sensing technology in Microsoft Corp.’s Kinect gaming system. Purchases of software startups in the field, Metaio Inc. and Flyby Media Inc., followed in 2015 and 2016.
Google Glass was heavily criticized over privacy concerns and, as a result, never really caught on with consumers. Google eventually stopped developing Google Glass in January 2015.
It is not clear how Apple would overcome the privacy and safety issues that Google faced, nor if the project will progress, but Apple CEO Tim Cook has expressed Apple’s deep interest in augmented reality multiple times over the last few months, suggesting something big is in the works. “AR can be really great,” he said in July. “We have been and continue to invest a lot in this. We’re high on AR in the long run.”
Past rumors have also indicated Apple is exploring a number of virtual and augmented reality projects, including a full VR headset. Apple has a full team dedicated to AR and VR research and how the technologies can be incorporated into future Apple products. Cook recently said that he believes augmented reality would be more useful and interesting to people than virtual reality.
Airliner’s near miss with drone injures two crew members
The threat of drone collisions near airports isn’t just scary — it can lead to very real injuries, even if there’s no accident. Canada’s Transportation Safety Board is investigating an incident where a Porter Airlines flight bound for Toronto took evasive maneuvers in an attempt to avoid a reported drone, injuring two crew members. The exact circumstances (including the nature of the injuries) aren’t clear, but the incident took place near Billy Bishop Airport, an island hub right near Toronto’s downtown core. It wouldn’t have been hard for someone on the mainland to fly a drone into the path of a low-flying aircraft.
This certainly isn’t the first time there have been reports of near collisions with drones, and it’s possible that something else may have prompted the emergency change of course. However, the injuries could easily amplify calls for drone-finding systems at airports, not to mention anti-drone defenses. While the chance of a serious collision is slim, it’s clear that even a close call can be exceptionally dangerous.
Via: CBC News
Source: Transportation Safety Board
Verizon builds on IoT division, acquires ‘smart city’ tech firm
Verizon is hoping to make our city street corners smarter in the not-too-distant future.
Verizon made a move to expand its IoT business today, announcing the acquisition of LQD WiFi LLC, a private company based in New York that designs and develops publicly accessible smart hubs. Branded as Palo, these are street-level, kiosk-style structures that offer a range of services, such as local community information, wayfinding, public safety announcements and transit updates via 46-inch touchscreens, along with free public Wi-Fi, security cameras and emergency calling. The full terms of the transaction were not disclosed.

LQD Founder and CEO Randy Ramusack explained the value their Palo kiosks are designed to add to communities:
“We designed Palo, from day one, to be part of the community, offering Wi-Fi, public safety features and a unique, interactive community engagement platform. Palo’s human-scale touch screen lets users explore and connect with the local community creating multiple ways to engage, through an innovative, purposeful and curated experience.”
This latest acquisition represents another huge step towards Verizon’s goal of expanding its IoT business into the realm of developing “smart cities”. Mike Lanman, senior vice president, Enterprise Products and Internet of Things at Verizon, touched on the value LQD’s technology will add to Verizon’s portfolio in the release:
“LQD’s Palo technology hubs capture Verizon’s vision of delivering citizen engagement experiences by connecting people with their communities while providing critical security, transportation and wayfinding solutions as well as Wi-Fi capabilities. This transaction uniquely positions us to utilize our unmatched infrastructure, platforms and network at scale to deploy elegant and engaging community technology hubs that connect, inform, inspire and support people where they live, work and play.”
It might be some time before you see these Palo kiosks popping up in your town — unless you happen to live in the city of New Rochelle, New York, where LQD has a contract in place to run a pilot program testing its technology.
Using the smaller Pixel with Google Daydream is a big compromise

It’s a good idea to buy a Pixel XL over a smaller Pixel if you want to take advantage of Google Daydream.
It’s not looking good for Pixel owners who want to use Google’s new Daydream View VR headset. The phone’s 1080p resolution AMOLED display is good enough for daily use when held at a nominal distance from one’s face, but strapped into a VR headset, the individual pixels are not only visible but distracting, and our advice is to avoid using the smaller Pixel for mobile VR.
The resolution

Let’s talk about the resolution. I spent a lot of time going between the Pixel and Pixel XL in the Daydream View, and the higher density of the latter phone’s QHD display — 534ppi to the Pixel’s 440ppi — is not only obvious but essential to the enjoyment of the experience. I found myself often distracted, even in the midst of a video or game, by the individual components comprising the whole. And while those components are still somewhat noticeable on the larger Pixel XL, we’ve learned by now that the more we can minimize distraction while in virtual reality, the more immersive and enjoyable the experience becomes.
I still enjoyed the Pixel in the Daydream View, and unless you’ve used a Gear VR with a Galaxy S7, or something with an equally high pixel density, you may not be disappointed. I have, though, and I was.
The field of view

The other thing to keep in mind about using the smaller Pixel is that its physically smaller screen makes the field of view ever-so-slightly narrower, lessening by a few millimeters the amount of space you can see in virtual reality. In, ahem, reality, it’s not that big a deal, and you’re unlikely to notice unless, as above, you’re coming from the Pixel XL. But keep it in mind.
Light bleed


This is a problem for both Pixels, but the smaller unit lets in noticeably more light than the larger one. It’s not that the bottom area leaks in extra light; rather, the phone’s smaller overall surface area means more light creeps in around it, which in a well-lit room makes the experience slightly more distracting.
That’s easily avoided by lowering the ambient light around you — turn off the bulbs, lower the blinds, close the curtains — but if you’re in an environment like an airplane where you can’t control these things, the Pixel XL is a superior choice.
Battery life and heat

The Pixel XL does have a larger battery than the Pixel, but due to its extra resolution it doesn’t necessarily achieve additional uptime. I typically got three to four hours of constant use from each unit, which is pretty darn impressive for a mobile VR setup, and unlike the Gear VR, the phone’s charging port remains accessible while lodged in the Daydream View. Future Daydream headset designs will likely be required to allow for concurrent charging, too.
Both phones get very hot while inside the View, but the smaller Pixel did seem slightly warmer to the touch. I don’t have a thermometer to measure the exact difference between the two, but I’d say the Pixel was some five to 10 degrees Fahrenheit toastier than the XL.
Other considerations

Obviously, the Pixel XL is a sizeable jump in price from the Pixel — $120 USD, and more in some markets. That, plus the $79 for the Daydream View itself, is no small price to pay. If it comes down to using a Pixel with Daydream, or nothing at all, of course I’d choose the former: the experience is amazing regardless of which Pixel you’re using, and Google has done an amazing job ensuring that either model works seamlessly within the View headset.
There’s also the question of color: is it better to use a black or white Pixel with the Daydream View? I got a chance to use both colors and found little difference between them; the rumored additional reflectiveness of the white phone face did not manifest itself in real-world use, especially when the ambient light was low.
At the end of the day, Daydream and its first headset, Daydream View, are remarkable achievements, and must be tried even if you’re familiar with the Gear VR, Cardboard or any other mobile VR solution. Daydream is fun; it is effortless; it is flawed, in a good way. I love it. Even with the Pixel, which is distraction central.