You likely know that Microsoft packed a lot of improvements into the Windows 10 Creators Update, but there are still a few surprises left… particularly if you use Windows’ built-in navigation app to get from A to B. Microsoft has detailed Creators’ numerous Maps upgrades, many of which revolve around planning trickier routes. For one, you can create multi-stop routes. That’s nothing new if you use Google Maps, but it’s a big deal any time you use Microsoft’s default tools.
You can also create collections (say, all the destinations you want to visit on vacation), and take advantage of Windows’ pen support to jot down annotations. And if Microsoft or its mapping partners have made a mistake, you can suggest corrections to save others from going astray.
Some of the big updates are more passive in nature. You now have Maps access on the Xbox One and HoloLens — it’s not just for PCs and phones. The road view now accounts for 3D terrain, so it’ll be obvious when you’re going to wend through a scenic valley. You’ll also see place details on the desktop just by hovering over a point of interest, and a dark mode will save you from going blind when returning home at night. It’s easier to check traffic, too. Mobile users, meanwhile, now have a direction-sensitive view that can help when trying to get your bearings. Again, many of these features aren’t completely novel. They might, however, get you to try Maps when you’d otherwise be tempted to use alternatives.
Source: Windows Experience Blog
Commuter trains are already somewhat eco-friendly by their nature (you’re less likely to need a car, after all), but the San Francisco Bay Area’s train system is taking things one step further. BART (Bay Area Rapid Transit) has unveiled a policy that will gradually move it to completely renewable energy. It starts off modestly by limiting CO2 emissions now through 2024, but the plans will be more aggressive after that. At least half of its energy will have to come from renewable sources by 2025, with 90 percent of it from low or zero-carbon sources. All of it will have to be zero-carbon by 2035, while complete reliance on renewable sources would come by 2045.
This isn’t exactly an overnight revolution, then. However, BART notes that it would actually outperform California’s planned standard of 50 percent renewable energy use by 2030. Also, any improvements will likely make a tangible impact on the state. BART uses more power than the entire city of Alameda (over 400,000 MWh per year) — even that 2025 target might help a lot. It’s also important to note that BART expects to run both trains and its infrastructure on green energy sources. The area’s Caltrain service has already made its own pledge to use renewable energy, but it’s still using diesel trains, whereas BART’s vehicles are completely electric.
Only some of this will come through in-house energy generation (primarily through solar power), since BART just wouldn’t have the capacity to meet all its own demand. Most of it will come by purchasing energy from the grid. There will be a certain point at which you can ride the train largely guilt-free, however, and whatever BART accomplishes might help other transportation networks achieve their own renewable energy goals.
Via: Greentech Media
The interplay between marijuana and technology is a story that’s been told a million times, but for some reason, everyone always leaves out the middle.
Generally speaking, the beginning and end of the process get all the attention. There’s the growing side of the story, with all the high-pressure sodium lights, variable wavelength LEDs, and hydroponic growing systems — and then there’s the consumption side: the concentrates, the vaporizer pens, and the Uber-style startups that will deliver a blunt directly to your doorstep at the push of a button. These are the stories you usually hear when the topic of cannabis tech comes up.
But there’s a story between those steps that nobody ever talks about: The scientific testing that takes place somewhere between growth and consumption. It’s a hugely important part of the process, but because it takes place behind closed doors, it’s also shrouded in mystery.
Before legalization took off in the United States, cannabis testing wasn’t mandatory. There was no regulation, so the only people who sought out testing were those curious and well-funded enough to find out how potent their crops were. But when legalization came, so did regulation. In order to be sold through the proper legal channels, marijuana needed to be tested for potency, pesticides, and a variety of other things depending on what state you happened to be in.
This change immediately led to a huge boom in demand for marijuana testing services. Suddenly, it wasn’t a matter of choice; it was something that every grower was required to do by law before selling their wares — so testing labs quickly sprang up to fill that need. Nowadays, there are hundreds of them scattered across the country, mostly in states where both medical and recreational marijuana use have been legalized.
But what exactly goes on in those labs? How exactly do they conduct their testing? And what do they do with all the data they collect? To answer these questions, Digital Trends went on a field trip to ChemHistory — one of Oregon’s biggest cannabis testing labs — to get the skinny on how everything goes down.
Grinders and beakers and test tubes, oh my!
The process starts off just like you’d expect: a sample comes in for testing, then ChemHistory enters it into a lab information management system on a computer and gives it a unique identifying number.
Once it’s in the system, the science begins.
Step one is to take a “representative sample” of the cannabis product — as objectively as possible. To do this, the sample is placed on a grid, and an algorithm dictates which regions of that grid will be used for the test. This prevents human bias from entering the experiment, and prevents lab technicians from selecting the most visually appealing nugs. The weed is then photographed and ground up — with its own grinder, of course, to ensure there’s no cross-contamination between samples.
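ChemHistory hasn’t published the algorithm itself, but the core idea — letting a seeded random choice, rather than a human, pick which grid regions get tested — can be sketched in a few lines of Python. The function name and grid dimensions below are hypothetical, purely for illustration:

```python
import random

def pick_sample_regions(rows, cols, n_regions, seed=None):
    """Randomly choose distinct grid cells so no human picks the prettiest nugs."""
    rng = random.Random(seed)  # seeding makes the selection reproducible/auditable
    all_regions = [(r, c) for r in range(rows) for c in range(cols)]
    return rng.sample(all_regions, n_regions)

# e.g. test 4 of the 16 cells in a hypothetical 4x4 grid
regions = pick_sample_regions(4, 4, 4, seed=42)
```

Because the seed can be logged, anyone auditing the lab could re-run the selection and confirm which cells should have been sampled.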
Then it starts to get complicated. The next step is to weigh out a predetermined amount of the ground-up sample, then use a specific chemical solvent to separate the “potency” from the “matrix”. In this example, let’s say the potency is THC, and the matrix is the plant matter that the THC is clinging to. Before the THC can be measured by ChemHistory’s lab equipment, it first needs to be pulled out of the shrubbery it arrived on. This is achieved with a solvent, which is pumped into a test tube along with the ground-up marijuana.
Once the potency has been liberated from the buds and left floating in the solvent, it’s then diluted and placed in a small vial. This vial is then loaded into a robotic “auto-sampler” that feeds the diluted sample into a gas or liquid chromatography machine.
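Working back from what the instrument reads in that diluted vial to a label-ready potency figure is straightforward dilution arithmetic. Here’s a sketch — every number in the example is hypothetical:

```python
def potency_percent(measured_mg_per_ml, dilution_factor, extract_volume_ml, sample_mass_g):
    """Total analyte recovered, as a percentage of the original sample's mass."""
    # Undo the dilution, then scale by the total extract volume
    analyte_mg = measured_mg_per_ml * dilution_factor * extract_volume_ml
    # Convert mg -> g and express as a percentage of the weighed-out sample
    return 100 * (analyte_mg / 1000) / sample_mass_g

# Hypothetical run: 1.0 g of flower extracted into 10 mL of solvent,
# diluted 10x, and the instrument reads 2.0 mg/mL of THC in the vial.
# 2.0 * 10 * 10 = 200 mg THC -> 0.2 g per 1.0 g of sample = 20% THC
result = potency_percent(2.0, 10, 10, 1.0)
```

Real labs calibrate the instrument against reference standards before trusting that mg/mL reading, but the back-calculation itself is no more complicated than this.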
We won’t get into the specifics of how this machine works — just know that it effectively separates out the different molecules in a sample based on molecular mass. Think of it like a pack of runners spreading out over the course of a race based on speed and stamina. Molecules do the same thing in this machine, which allows them to be analyzed separately in the next step of the process.
Once separated, the next step is to figure out the identity of each molecule. This is achieved with the help of a machine known as a triple quadrupole mass spectrometer. Again, we’ll let you peruse Wikipedia to learn how this tech works — but really, the details aren’t terribly important. The mass spectrometer essentially allows ChemHistory to determine not only the identity of a given molecule (like THC or CBD), but also the relative quantity of that molecule in relation to the sample.
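One wrinkle worth noting: THC and CBD are structural isomers with the same molecular formula (C21H30O2), so mass alone can’t tell them apart — it’s the combination of chromatographic retention time and mass that pins down a peak’s identity. A very rough sketch of that matching logic, with invented retention times (real labs also quantify against calibration standards, not raw peak ratios):

```python
# Hypothetical reference library: (retention time in minutes, monoisotopic mass).
# THC and CBD share the same mass, so the (invented) retention times
# are what distinguish them in this illustration.
REFERENCE = {
    "CBD": (5.2, 314.2246),
    "THC": (6.8, 314.2246),
}

def identify(peak_rt, peak_mass, rt_tol=0.2, mass_tol=0.01):
    """Match a detected peak to the reference library within tolerances."""
    for name, (rt, mass) in REFERENCE.items():
        if abs(peak_rt - rt) <= rt_tol and abs(peak_mass - mass) <= mass_tol:
            return name
    return None  # unknown peak

def relative_quantity(peak_area, total_area):
    """Naive relative abundance of one peak within the whole chromatogram."""
    return peak_area / total_area
```

Under these assumptions, a peak at 6.75 minutes with mass 314.22 would match THC, while the same mass at 5.1 minutes would match CBD — which is exactly why the chromatography step has to happen before the mass spectrometer weighs in.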
But the process doesn’t end there.
More than just a label
If you think that all of this information just gets printed on a label and slapped on the side of a nug jug, you’re dead wrong.
That might’ve been how it worked a few years ago, back when the industry was young and labs were analyzing weed simply because the government said testing was necessary to protect consumers — but lately, entrepreneurs have realized that all this cannabis data can be used for much, much more.
All of the data produced by ChemHistory — and dozens of other labs around the country — is funneled into a software platform called Confident Cannabis. It’s a free lab information-management suite built specifically for marijuana-testing labs, and as such, a large number of labs use it.
Collectively, these labs record an absolutely massive amount of data about the weed they test, and Confident Cannabis analyzes this information to find insights, averages, and trends. It’s not just THC and CBD concentration, either — labs also collect information about the strain of marijuana plant, the grower who produced it, the region it was grown in, and the terpenes (flavor/aroma molecules) present in each sample.
This data is beneficial for everyone involved. Growers of high-quality weed can use these lab reports to fetch a higher price when selling their product to dispensaries. Dispensaries, in turn, can use this data to put better products on their shelves. And of course, consumers can use this information to make more informed purchasing decisions.
Need something that’ll get you high and giddy, but not so blazed that you stare at the refrigerator for three hours? What about something that has a lemony flavor, and also has pain relieving effects? Thanks to all the scientific data that labs have been gathering over the years, it’s now easier than ever to find cannabis products that are perfectly suited to your specific needs.
It’s easy to dismiss lab testing as a strict, government-imposed step that marijuana growers follow simply because they have to — but lab data is no longer just about protecting consumers from smoking pesticide-laced pot. Information is power, and startups like Confident Cannabis are using that power to foster healthy competition between growers, boost the quality of the marijuana available to consumers, and ultimately drive the cannabis industry forward as a whole.
That massive Google Docs phishing attack from May 3rd was more than a little disconcerting, but Google is trying to set minds at ease. It just outlined how it responds to this email trickery — including how it intends to prevent incidents like the one that just wreaked havoc. It’s shoring up its defenses by tightening its policies on third-party authentication (the Docs attack steered users toward a bogus app using a Google sign-in), refining its spam filtering to target Docs-style campaigns, and more closely monitoring apps that ask for your data.
The internet pioneer already has a number of anti-phishing measures, such as machine learning-based detection, Safe Browsing, email attachment scanning and extra security measures for suspicious-looking logins.
Google stresses that the May 3rd attack didn’t do much damage. Fewer than 0.1 percent of its users were affected, it says. That’s somewhat comforting, but it was still hard to escape the phishing campaign for the brief hour it ran rampant. Even that small a portion of Google’s user base still represents many, many people. The stepped-up anti-phishing efforts might be necessary to prevent another large-scale mess.
Source: Google Security Blog
Apple Maps Vehicles Begin Surveying Connecticut, Imagery Could Aid Apple’s Autonomous Driving Efforts
Apple has updated its website to indicate that its Apple Maps vehicles will begin surveying Connecticut for the first time this month.
For nearly two years, Apple has been driving vehicles around the world to collect data for Apple Maps—widely believed to be street-level imagery. Since 2015, the vehicles have surveyed over 30 states in the United States, in addition to parts of the United Kingdom, Italy, France, and Sweden.
Apple said it will blur faces and license plates on collected images prior to publication, suggesting that it could be working on adding a Street View feature to Apple Maps, similar to what Google Maps has offered for several years. But the imagery and other mapping data could be used for a variety of purposes.
When Apple’s fleet of Dodge Caravans first hit the streets, it was speculated they could be the basis of an Apple Car. But those rumors quieted down after the vans were labeled with Apple Maps decals, and because Apple has shifted towards autonomous driving software, rather than an entire vehicle, at least for now.
Moreover, the California Department of Motor Vehicles confirmed that Apple is using a fleet of Lexus SUVs, which have since been spotted on the road, to test self-driving software. It’s known that Apple’s platform currently uses a Logitech wheel and pedals, and drivers can take over manually if necessary.
Nevertheless, so-called Apple Maps vehicles could still be playing a role in the company’s autonomous driving plans.
Neil Cybart, an independent Apple analyst at Above Avalon, told MacRumors that Apple Maps vehicles are “very likely capturing mapping data,” such as street level imagery, that will aid Apple’s autonomous driving efforts.
I don’t think these Apple Maps vehicles are just meant to improve Apple Maps. Instead, my suspicion is they are part of Project Titan. Specifically, the vehicles are likely playing a role in building the groundwork for Apple’s autonomous driving technology. The data collected by these vehicles may be used for testing autonomous driving technology using indoor simulation.
Cybart, who confirmed seeing an Apple Maps vehicle in Connecticut earlier this week, said the mapping data collected could be a “foundation” for Apple’s autonomous driving technology platform.
Apple Maps vehicles are not autonomous cars. Instead, they are very likely capturing mapping data (i.e. imagery) that will aid Apple’s autonomous driving efforts. My view is that this mapping data isn’t just for Apple Maps Street View, which wouldn’t be too useful, but rather for building a mapping foundation for Apple’s autonomous driving technology platform.
Connecticut and many other states that Apple has surveyed don’t currently allow autonomous vehicle testing on their public roads, so Apple very likely is collecting data only, as it says. Whether that data is used for a Street View feature, autonomous driving software, or both, remains to be seen.
A team of scientists has snipped away HIV DNA from the genome of live mice using a CRISPR system, and the rodents lived to (kinda) tell the tale. It’s still much too early to call the method a possible cure, but the fact that it worked on a living animal opens up a lot of possibilities. Will it work on other diseases, like cancer? Maybe, but that’s something scientists still have to look into. The researchers, headed by neurovirologist Kamel Khalili, have been focusing on using the gene-editing technique to eliminate HIV for years. They successfully excised HIV DNA in live mice last year, but this round is a lot more thorough.
The team used three different types of mice for their most recent experiments. The first type is infected with latent HIV lurking in its cells (the same type they used for their trials in 2016), while the second has actively replicating HIV in its system. The last type is grafted with human immune cells, including T cells that have latent HIV in them. HIV can actually hide in the very immune cells that are supposed to kill it, which is one of the reasons it’s so hard to find a cure for the condition. While anti-retroviral therapies allow HIV patients to live a normal life, they’re far from perfect. As Khalili explained:
“The current anti-retroviral therapy for HIV is very successful in suppressing replication of the virus. But that does not eliminate the copies of the virus that have been integrated into the gene, so any time the patient doesn’t take their medication the virus can rebound.”
To introduce their CRISPR system into mice without it being attacked by the rodents’ immune systems, the researchers attached it to an adeno-associated virus (AAV), which doesn’t typically trigger a response in the body the way other viruses do. The system eliminated the rodents’ HIV as the AAV replicated. While successful, the team still has to make sure their system snipped away only the HIV and none of the good DNA. That’s why they’re planning to repeat the experiments on primates, whose DNA is closer to humans’. Ultimately, they hope to follow in the footsteps of the Chinese oncologists from Sichuan University who have already conducted CRISPR human trials.
Source: Molecular Therapy, Temple University
Before the release of consumer-grade virtual reality hardware, HTC wasn’t anywhere near the household name that Oculus was, but since then it’s arguably become the most recognizable brand for high-end, PC-based virtual reality. Its Vive headset is still the one we would recommend for those looking for a cutting-edge VR experience and it has a lot of content to go along with it.
Now, a year on from its original release, the headset is a little lighter, has a new audio strap in the works, and HTC has expanded its software base with Viveport. The company also offers a monthly subscription service for game delivery and has even partnered with a number of non-gaming firms to develop uses for virtual reality outside the home PC.
To find out how HTC feels this first year of commercial virtual reality has gone and what it has planned for the future, we organized a digital sit-down with Daniel O’Brien, the general manager of HTC’s Vive division.
Digital Trends: How happy have you been with the uptake of the HTC Vive and its software over the past year?
Daniel O’Brien: We’re really happy with the volume of content and with its monetization. We’ve seen many of our developers making over a million dollars in revenue and a number of small teams experimenting with VR making over $25,000. What that shows us is that there’s good traction toward what we thought would happen in the first year.
We met our [headset sales] goals in year one and we’re on track for year two. We’re holding a very good position with retailers and we’re expanding our relationship with them in 2017, where we feel that some of our competitors are going in the other direction.
We’ve now reached 1,500 content pieces for the Vive on Steam. Although there are a lot of tech demos in there, I believe that some of the best VR experiences available today can be found on Steam and they represent the true promise of what VR is.
You also launched your own content store, Viveport, last year. What has the uptake been like for that?
Viveport is in its infancy; it’s just beginning its first year as a new content store for distribution. We’re really excited about having a store that can have a non-game focus. Although gamers are going to be the main audience for VR for another year or so, giving non-games another marketing avenue is actually working out really well for us. It’s expanding the ecosystem for the kinds of content you can actually promote.
You’ve also recently launched a subscription model for Viveport, where users can pay $7 a month to have access to five applications on a rotating basis. What has the response been like to that?
That’s actually got a really high attach rate right now, which is something we’re really excited about. It gives our developers a new revenue channel and is a low risk way for consumers to try out different content pieces.
What are some of your favorite games that you’ve seen come to the Vive over the past year?
It’s hard for me to pick an outright favorite, because I always get to see the next big thing that’s on the horizon that hasn’t commercialized yet.
My favorite right now is the Rick and Morty game. It’s so engaging, but I’ve only had a chance to play maybe an hour or so. I love the puzzle games like The Gallery. That’s always been a favorite of mine, even before we launched the Vive. Google Earth is something I spend a lot of time in.
I’m also really enjoying the new innovations that are happening to locomotion in games too, like what Survios has done with Sprint Vector. We actually have a leaderboard for that game in our house and my son is beating me handily.
What are some of the new controller types and innovations you’ve been impressed with using the Vive tracker?
While we’re seeing obvious innovations like the gun and the sword, what we’re really excited about is the wide variety of solutions. Full body tracking is something that’s doable with the tracker. All you need is three trackers, the Vive and the controllers and you literally have your whole body tracked in virtual space.
We’ve also seen knuckle controllers and phones that can see into VR. We’ve seen a lot of experiments in the input area.
What’s important about the tracker, though, is that it saves developers thousands of man-hours of engineering, research, and development that would otherwise go into building their own tracked peripheral or object.
The movie and entertainment industry is now starting to use these trackers rather than motion capture as part of their development tool set too. While I can’t confirm which AAA studios are using the Vive, I can tell you that every single one of them is developing with trackers and looking at innovative ways for their IPs.
When do you think we’ll see the first AAA games start to appear on the Vive?
We’re very excited about Valve committing to VR with the three titles it’s making for it and Bethesda announced its Fallout strategy. We do know of other partners that are working on various large IP projects with the Vive and they just haven’t announced those yet.
In the second half of 2017, I think we’ll see a lot more announcements and a lot of projects on the horizon. We’re going to start to see stuff that’s more relevant to that later this year, and in 2018 there will be a lot more content coming.
We would expect that to be a turning point for the industry, where until those big experiences and IPs occur, hardware revenues will continue to outpace software revenues. As we get closer to 2020, we’re going to start to see content revenue outpace hardware. That’s a pretty normal path for what we think is going to happen with this sort of industry.
Valve and HTC both offered classes and courses for developers to help get them started with development in VR. How important have those been in helping to kickstart that over the past 12 months?
I think it’s been very critical. We wanted to produce the hardware, we wanted to produce the platform to make it possible to get the content launched. I think we did a good job of doing that as a foundation for the industry.
We’re also starting to see a lot of university programs and hackathons, which are great. These guys are all really proactive in their communities and in organizing events for developers to meet up and talk about the tools that they’re using and the applications for them.
If you look at the mixed reality filming setups that we use at every conference and event that we go to, that was developed by the Northway Games and Radial Games teams. They gave that software away and it made it much easier for other developers to create trailers and to advertise their games. We’re still in this phase where people are openly sharing their software and their code and ideas and I think that’s really positive. We want to try and keep things as open as possible as we continue to expand the developer base.
In our original interview last year, we spoke about exclusivity of VR games and platforms and you were quite clear that HTC was staunchly against such a practice. Is that something you maintain now a year on from release?
For us it’s very important. [Being open] is the best way for developers to monetize their content. Developers need to be able to make money. We spend a lot of time working on these platforms, but if a developer can make something for the Vive and then take their game to other VR platforms, we’re completely open for that.
From a typical or historical model in the console market, a developer could be really successful with a large install base of consoles. That’s not necessarily the case with VR right now. For a developer to be hamstrung or restricted to one store or platform only, that’s really limiting their ability to generate revenue and evolve their content.
That’s why we work so hard not to close things down, because that only limits developers further. We want them to have long term health and success.
VR arcades are something that HTC has been keen to sponsor over the past year. Do you think VR has the potential to bring back the public arcade experience?
I think so. We’re already seeing early success there. Just look at IMAX theaters: they’re already claiming to have run 20,000+ people through them. I went to a birthday party last weekend at a Seattle warehouse for my son’s friends, and the whole place was a virtual sports zone. Those rooms were full all the time, and for many people that was their first experience of trying VR.
I think VR arcades have a distinct ability to attract a consumer and get them into VR and get them excited about it. This is a model that already exists in markets like China, for PC gaming and we’re now seeing that same sort of public setting for VR show up in Europe and the Americas. People have some really great and innovative ideas about how to do it.
What we’re really working on now is how developers and arcade owners can monetize those experiences without necessarily having to make their own deals with each other. They don’t necessarily have the relationships we do, so we’re acting as a path for that.
The one thing that didn’t exist in the original arcades that does today, is a gaming audience. The kind of people that like to watch other people play games. Today you have these global leaderboards, you have Esports audiences, streamers, entertainment companies getting behind gameplay. I look at arcades and see a whole other method for people to get excited about what’s happening in VR.
How has the Vive been used to help product design through tools like MakeVR?
I can say with confidence, that I don’t know a single major automotive company that isn’t using a Vive to experiment with and use for design and engineering decisions. We’re seeing it in the architecture design industry and we’re starting to see the medical industry really take hold of VR technology.
These companies would love to not pay hundreds of millions of dollars to make mockups of new cars and products.
It’s not just about design but collaboration. You can literally take 20 of your designers and put them in the room with the design of the product and go, “ok, let’s make design decisions about this.” Instead of putting people on planes, you can now make those sorts of design decisions in VR.
You’ve continued to make changes to the Vive’s design over the past year. Why were some of those innovations not present in the original version?
On this first headset, there were things we wanted to do that we didn’t get a chance to. It wasn’t that we skipped them because we had to get to market at a specific time; it was that we didn’t have a solution that made the product better, and this is what works.
We originally wanted to launch with built-in audio, but the design we had at the time just wasn’t comfortable, so we decided to move forward as we did. As a phone manufacturer, we had a lot of experience with the question of, ‘do we want to launch with audio in the box or not?’ There are a lot of people who like their own audio. We know who our early adopters, the ‘innovators,’ are; these people are really discerning about the type of audio products they use. Yes, there are some people who are going to use the in-box earbuds, but a lot of early adopters want to have their own choice of audio. Especially if they’re hardcore gamers.
We made a conscious decision with the launch of the Vive to leave the audio strap for now and give people the option. Now we’re getting into a different consumer as we grow the consumer base and that is an audience that is more interested in being given an embedded audio solution.
As we grow the audience further, we’re growing the peripherals that go with it. We’re listening to audiences and developers and basing a lot of our design decisions off of that. If we can bring a lighter tether that makes it more comfortable mid-generation, there’s no harm in us doing that.
When it comes to major jumps like resolution, that’s new-new products.
When can we expect to learn more about the Vive 2?
We’re always continuing to listen to developers about what they think the most beneficial next-generation improvements are, and that’s how we’re working out the next headset and when it will come to market. It’s not about picking a production cycle and timeline; it’s about bringing really meaningful innovation that helps the developer community create compelling new experiences.
When third parties like TPCast are developing technology like wireless virtual reality, is that something that HTC will co-opt, or can we expect it to have its own solution with a next-generation headset?
TPCast is actually a ViveX company that we invested in and we’re helping them come to market. That is an add-on. In 2017 I feel like wireless is going to be an option for consumers. Later on in 2018, wireless will be an expected feature.
I think there are going to be multiple wireless solutions together. We’ve seen models where partners are building their own proprietary wireless solutions as an add-on, and would rather just design a reference design and license out the solution to HTC or TPCast. Developers who spend five or more hours in virtual reality are really looking forward to wireless and a more comfortable head strap. It just makes it easier for users to use VR. As you get closer to wireless and a lighter, more versatile headset, you broaden that consumer audience and what they are ready to buy.
Have you noticed a trend over the past year that new graphics card releases aid VR adoption?
Absolutely. When you look at the adoption rate of VR, in the early days we knew there were eight to 10 million people on Steam alone who had a PC that didn’t need any upgrades to run a Vive. We also knew that there was another base of around 15-20 million consumers who only needed a graphics card upgrade to be ‘VR ready.’ We know that the GPU is the next addressable market, so as the cost of those graphics cards has come down and their performance has improved, we’ve seen an immediate positive impact on sales, like when we bundle a Vive with a graphics card.
When we look at the adoption curve, we’re moving through the innovators with the high-end systems, and now we’re going through the early adopter phase. These are people who understand what GPUs and CPUs are and what they need, and that’s very different from when you reach adoption in a mass consumer base.
Is augmented reality something that HTC plans to implement in the Vive in the future or address in another hardware solution somehow?
Augmented reality is something that will absolutely occur. Is it going to happen next year in a meaningful way? No, but within four to five years, for sure.
AR is still on [HTC’s] mind and as that market evolves, we will very much look to have our brand in that space as well. You won’t hear from us on that until we have something to say that is meaningful to the developer community.
What is the relationship like between HTC and Valve today?
We’ve really not changed our model of collaboration. We share concepts and designs and thoughts about future products. We worked with Valve on the trackers and the new headstrap design. Valve is an invaluable resource in terms of feedback and thoughts on future designs.
Android is secure, but your phone probably isn’t. 3.5 million pieces of malware in 2017 means that matters.
When you’re king of the hill you are a target for everyone and everything. Sometimes that’s great — LG’s G6 is an awesome phone that wants to compete with the Galaxy S8 because the GS8 will be the king of the Android hill. Other times it’s not, and security company G Data takes a look at one of those not-so-great times.
When you’re on top you are a target.
Android’s market dominance means it is the main target for people writing malware. Just like Windows on the PC, the fact that more than 70 percent of smartphone users worldwide use Android means it’s where you want to focus if you’re trying to steal user data. There is certainly malware for iOS, and probably even Windows 10 Mobile, but if you’re playing the odds, Android is the target.
G Data forecasts that it will see 3.5 million cases of malware for Android in 2017. A look at the numbers since 2012 shows that it’s not making an outrageous claim, either.
Image courtesy G Data.
There’s a reason why malware is so successful on Android, and it’s one that still hasn’t been addressed: most phones are running old software and haven’t been patched against known exploits.
Google does a lot of work to make Android secure and keep it that way. It pays people to find security exploits, works with hardware vendors like Qualcomm or NVIDIA to fix them if needed, then writes a patch that can be injected into the existing version with no fuss. If you have a Pixel or Nexus or BlackBerry product, you’ll then get these patches. If you have any other phone, you roll the dice and hope the people who made it care enough.
More Androids run Gingerbread (2010) than run the current version.
Forget about the Pixel or a Nexus for a minute. They have to be updated because there is no way Google can say that these updates are really important if they aren’t. Google may be silly sometimes, but not that silly. But BlackBerry? It’s hard for me to imagine any scenario where you can set the bar lower than using BlackBerry as the example.
BlackBerry (the software company from Canada) operated a month away from bankruptcy for a year or so before finding a way to stay afloat and reinvent itself. It’s not in the black (pun intended) because it can ship a security patch 30 days after receiving it. Security may be BlackBerry’s “thing,” but as far as resources go, Asian phone manufacturers dwarf it. My take is that it does it because it has found a way to streamline the process and not spend hundreds (or more) of man-hours on the patches. And whether a model sells a million units or 50 million units, you’re only writing one patch.
Android 7.1 is on 0.5% of the 1.5 billion+ Android phones in use worldwide, or roughly 7.5 million devices. The number with the May 2017 patch is likely close to that, because the only phones that have it are running 7.1. And remember, the company that made your phone had that patch for at least a month before it was released. Even worse: more phones are running Android 2.3.3, which was released in 2010 and no longer receives any security updates, than are running up-to-date software.
Not everyone wants one of these.
Real talk: there has not been a security apocalypse for mobile devices. Yet. But this is a recipe for one, and it could happen tomorrow. Isn’t preventing a massive data breach that affects millions and millions of us better than crossing our fingers and hoping it doesn’t happen? Not everyone wants a “boring ass” Pixel or a BlackBerry. People want the things a Galaxy S8 or LG G6 give them. One of those things needs to be a little protection against the shitware that very smart people are making and looking for ways to push onto everyone’s phones.
Security updates need to become a feature along with a great camera and slinky glass body.
Usually, security companies write blog posts to push their products and a specific agenda. While G Data’s post may serve those goals, it also highlights the very real and very serious problem of easy-to-hack software storing your credit card numbers and passwords.
We wish there were better news here, but as usual, we can only offer the advice to be careful what you install and to get all of your apps from Google Play. Stay safe.
When director James Gunn revealed that Guardians of the Galaxy Vol. 2 would be the first movie shot with Red’s 8K Weapon camera, he triggered a bit of speculation: what prompted the move beyond the incredibly high resolution? You might have a better answer today. Red has posted a behind-the-scenes look at the movie that, to no one’s surprise, talks a lot about why the Vol. 2 team shot with such relatively exotic gear. And no, it’s not just about that picture quality.
As director of photography Henry Braham summarizes: the Weapon is a “large format” camera that’s simultaneously “tiny.” That let the crew shoot very detailed imagery regardless of the shot — important for a CG-heavy movie, since it maintains a consistently sharp look. They could use the same cameras for handheld close-ups or unusual rigs, such as a spider rig that flies along a wire. In short, they didn’t have to switch cameras or resort to convoluted setups.
The behind-the-scenes video is undoubtedly a puff piece meant to sell you on both the camera and the movie. However, it’s also a hint as to where movie technology is going. You can expect 8K digital cameras to become more commonplace, of course, but they also promise more elaborate cinematography that might have been difficult just a few years ago.
Seemingly every connected device has at least one wireless radio in it, and that often requires some big compromises. Radios chew up a lot of power, which isn’t always practical with Internet of Things gadgets that may not have much room for a battery. Disney Research may have a solution to that problem: ditch the radios entirely. Its scientists have developed technology that uses ambient radio waves to communicate. The approach uses very low-power sensor nodes to reflect radio waves from virtually any background source, whether it’s a distant tower or the phone in your pocket. RFID tags already use a similar approach, but tapping many more sources and multiple channels gives you a much longer range — in tests, the researchers achieved data links at distances up to 164 feet.
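Disney hasn’t published code for this system, but the core backscatter trick (sending data by switching between reflecting and absorbing an ambient carrier, so a receiver sees the signal’s strength rise and fall) can be sketched in a few lines. Everything below — the sample rate, the reflection gain, the simple on-off keying and energy-threshold decoder — is an invented toy for illustration, not the researchers’ actual design:

```python
# Toy backscatter sketch (assumptions, not Disney Research's real system):
# a tag modulates bits by toggling its antenna between "reflect" and
# "absorb" states, and the receiver decodes by measuring per-bit energy.
import numpy as np

RATE = 1000          # samples per bit period
CARRIER_HZ = 50      # ambient carrier frequency (arbitrary units)
REFLECT_GAIN = 0.3   # extra amplitude added when the tag reflects

def backscatter(bits, noise=0.05, seed=0):
    """Simulate the signal a receiver sees while the tag sends `bits`."""
    rng = np.random.default_rng(seed)
    t = np.arange(len(bits) * RATE) / RATE
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    state = np.repeat(bits, RATE)           # 1 = reflecting, 0 = absorbing
    rx = (1.0 + REFLECT_GAIN * state) * carrier
    return rx + rng.normal(0, noise, rx.size)

def demodulate(rx, nbits):
    """Recover bits by comparing each bit period's energy to the midpoint."""
    energy = np.array([np.mean(rx[i * RATE:(i + 1) * RATE] ** 2)
                       for i in range(nbits)])
    threshold = (energy.max() + energy.min()) / 2
    return (energy > threshold).astype(int)

bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
recovered = demodulate(backscatter(bits), len(bits))
print(recovered.tolist())  # → [1, 0, 1, 1, 0, 0, 1, 0]
```

Because the tag never generates a carrier of its own, its only “transmit” cost is flipping a switch, which is why real backscatter nodes can run on tiny batteries or harvested power.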
Naturally, you won’t see this in every device you own. You need other wireless signals for this to work, after all. That also rules out using the hardware in remote areas where there are few if any ambient radio waves. Disney’s invention might be very useful in smart homes and other electronics-heavy places, though. At the least, it could lead to slicker devices that don’t need big batteries. And at best, it could lead to smart devices that weren’t even possible before due to size or power requirements. You might also see more connected electronics that run primarily or exclusively on solar power.
Source: Disney Research