Top Amazon exec Morgan Wandell jumps over to Apple Video
Why it matters to you
Apple is entering the realm of streaming original series, a push that could force competing services to raise the quality of their programming.
Morgan Wandell, a top executive at Amazon since 2013, has been lured away by Apple as part of its bid to expand its scripted programming in the streaming market. With Apple ready to drop as much as $1 billion to develop original programming, according to Business Insider, the company hopes to become a major player in the industry.
Wandell arrived at Amazon from ABC Studios, and he has an impressive resume. He helped develop such series as The Man in the High Castle, Jack Ryan, Sneaky Pete, and The Marvelous Mrs. Maisel. While at ABC, he worked on such TV series as Grey’s Anatomy, Lost, Desperate Housewives, Private Practice, Criminal Minds, and Ghost Whisperer.
The Hollywood Reporter notes that Wandell joins a team that includes co-heads of video programming Zack Van Amburg and Jamie Erlicht, who were recently poached from Sony Pictures Television.
Apple’s first few attempts at original programming, such as Carpool Karaoke and Planet of the Apps, were underwhelming, to put it mildly. It recently scrapped plans for an Elvis Presley miniseries produced by the Weinstein Company, amid the fallout from sexual harassment allegations.
New moves at the executive level, however, indicate that Apple wants to position itself as a serious contender in streaming original series as part of an effort to make Apple Video a destination for documentaries and shows. According to the Wall Street Journal, it has partnered with Steven Spielberg’s Amblin Studios for a reboot of the anthology series Amazing Stories.
Although the hire has been in the works for months, it’s still a blow to Amazon, whose CEO, Jeff Bezos, wants the company to come up with “the next Game of Thrones.” Other tech companies want a piece of the pie as well — Facebook may be dropping a wad of cash to develop its own scripted series.
For comparison, HBO spends about $2 billion per year on original programming, Amazon spends approximately $4.5 billion, and Netflix around $7 billion. Apple’s $1 billion could fund as many as 10 television shows, with Erlicht and Van Amburg overseeing a cohesive strategy that the company has lacked thus far.
Will all these new moves pay off with the next House of Cards or The Handmaid’s Tale? We’ll have to watch and see.
From the Editor’s Desk: Android 8.1 and Oreo’s AI future

Pixel Visual Core and Mate 10 event offer clues about what’s next for Android and AI.
Google hasn’t yet (explicitly) announced Android 8.1 Oreo. But reading between the lines of two recent Android announcements gives us a small glimpse of the first Oreo maintenance release. Most significantly, expect a major focus on AI APIs that could bring exciting new features for Google Pixel 2 and Huawei Mate 10 owners.
For starters, we’re about due for an Android maintenance release before the end of the year. Just as Android 7.1 landed (for non-Pixels) in December 2016, an 8.1 launch before the holidays would see Google maintaining the cadence of quarterly MRs it has established through 2017.
The first Oreo MR will be an 8.1 (as opposed to 8.0.1) mainly because of the new APIs it’ll bring. Historically, a new API level almost always brings a 0.1 version bump for Android, for versions that aren’t also a whole new ‘dessert’ release.
How do we know there are new APIs coming? A Googler stood up on stage in Germany last Monday and said so. At the Huawei Mate 10 launch event in Munich, Jamie Rosenberg, VP of Android and Google Play, said:
“The Android neural network API will be coming to the Mate 10 in a software update early next year, and I can’t wait to see what developers do with this technology.”
New APIs? That’ll be a new Android MR, then, with a 0.1 version bump.
And for the Mate 10 series, that’ll be a very important update indeed, connecting the power of Huawei’s neural processing unit (NPU) to OS-level AI support in Android. (Currently, the company supports its own Kirin NN API, as well as Caffe2 and TensorFlow Lite.)
More: Huawei Mate 10 + Mate 10 Pro hands-on
As Google finalizes Android 8.1, expect to see new versions of some of its own apps, updated to take advantage of the neural networking APIs in the new maintenance release. Obvious candidates include Google Lens, when that eventually grows beyond the current Pixel-only preview release, as well as natural language recognition in voice search, and translation through Google Translate.
That’s all well and good for the Mate 10, which has its all-singing, all-dancing NPU to handle AI tasks such as complex image recognition around 20 times faster than a general purpose SoC. But what about Google, whose Pixel 2 phones use off-the-shelf Snapdragon 835s with no such integrated AI hardware?
Well, Google’s secret weapon here could be the Pixel Visual Core. The company’s first foray into the world of custom silicon wasn’t mentioned at the launch event for the Pixel 2 phones, instead only revealed this past week as the review embargo lifted.
The chip isn’t enabled yet, but will be activated in a future software update for Pixel 2 owners. (Again, expect that to be Android 8.1.) First and foremost, it’ll enable faster HDR+ image processing in the Pixel camera app, thanks to Google’s custom silicon.
As Jerry Hildenbrand explains:
We don’t have all the details; Google isn’t ready to share them and maybe isn’t even aware of just what this custom chip is capable of yet. What we do know is that the Pixel Visual Core is built around a Google-designed eight-core Image Processing Unit. This IPU can run three trillion operations each second while running from the tiny battery inside a mobile phone.
Rather than use standard methods of writing code, building it into a finished product and then trying to manage everything after all the work is finished, Google has turned to machine learning coding languages. Using Halide for the actual image processing and TensorFlow for the machine learning components themselves, Google has built its own software compiler that can optimize the finished production code into software built specifically for the hardware involved.
The mention of machine learning and TensorFlow there is significant. As much as image processing is the focus for this chip initially, Google will almost certainly be using Android 8.1’s neural networking APIs to hook the Pixel Visual Core into the camera app. That being the case, it raises the prospect of the PVC being usable for other AI-related tasks, both visual and non-visual, in the coming year.
And this could be what sets the Pixel 2 phones apart from other Snapdragon 835 devices over the next year. Android’s AI APIs will likely work on phones without dedicated neural networking hardware, but AI apps like Google Lens should be much quicker on phones like the Pixel 2 and Mate 10, which have the hardware to back it up.
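In practice, an OS-level neural network API is an abstraction layer: an app requests inference once, and the system routes the work to vendor hardware (an NPU, DSP, or GPU) when a driver is available, falling back to the CPU when one isn't. A toy Python sketch of that dispatch idea (none of these names are real NNAPI calls, just an illustration of the concept):

```python
# Conceptual sketch of accelerator-or-fallback dispatch, the core idea
# behind an OS-level neural network API. All names here are invented.

class CpuBackend:
    name = "cpu"

    def run(self, model, inputs):
        # Reference implementation: always available, but slowest.
        return [model(x) for x in inputs]

class NpuBackend(CpuBackend):
    name = "npu"  # stands in for vendor hardware like Huawei's NPU

def run_inference(model, inputs, drivers):
    """Pick the first registered vendor driver; fall back to the CPU."""
    backend = drivers[0] if drivers else CpuBackend()
    return backend.name, backend.run(model, inputs)

# A trivial "model": double the input.
model = lambda x: 2 * x

# With a vendor NPU driver registered, work goes to the accelerator...
print(run_inference(model, [1, 2, 3], [NpuBackend()]))  # ('npu', [2, 4, 6])
# ...without one, the same call silently falls back to the CPU.
print(run_inference(model, [1, 2, 3], []))              # ('cpu', [2, 4, 6])
```

The point of the abstraction is that the app code is identical either way; only the speed differs, which is exactly why the Pixel 2 and Mate 10 stand to benefit most.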
None of this is confirmed yet, so the usual pinch of salt should be applied. But Google’s announcements this past week, both direct and indirect, have given us a tantalizing first look at Android’s AI future.
Other morsels from a very busy couple of weeks in tech:
- The Huawei Mate 10 Pro is a very nice phone with an extremely good camera. I’m not the biggest DxOMark fan in the world, but I generally agree with them that it’s a close runner-up to the Pixel 2 in general photo performance. (Video, not so much.) Low-light in particular is enormously improved compared to the Mate 9 and P10.
We’ll get to this in more detail next week, but there are still a few software quirks in the build I’m using right now. EMUI 8 isn’t as big a visual refresh as I’d have liked, and Huawei’s skin still feels a little behind the times — especially next to Samsung and Google.
- Biggest thing that surprised me about the regular (non-Pro) Huawei Mate 10, as I started using it some more these past few days? How hard it is to hold onto. This phone is slippery af. Between the wider proportions, lack of chamfers and oleophobic-coated glass on both sides, this thing is a bar of soap in your hand. Same great hardware and experience as the Pro, for the most part, but it’s plain to see why the U.S. and most of Europe are getting the Pro — unless you’re in love with 16:9, it’s just a better phone.
- I’m continuing to enjoy the Pixel 2 phones, and I’ve switched to the XL as my daily driver, despite the weird screen which I still think is this phone’s biggest weakness. The latest alarming development there for me: some really brutal screen burn-in that raises questions about how well the phone will age.
- That said, as I mentioned on this week’s podcast, this is going to play out as follows: Google will release a patch restoring a more saturated default color setting (as opposed to sRGB), people will think it’s a “display fix” update, and will forget about the other issues like blue color shift and shadow detail crushing. And that’ll basically be the end of it.
- Going to be interesting to see what HTC can bring to the table in a potential U11 Plus. The phone looks solid, but given HTC’s shaky carrier support, can a mid-cycle release like this really move the needle?
That’s it for this week. We’ll have more Pixel goodness next week, along with our Mate 10 Pro review, and some surprises.
-Alex
Ben Heck’s oscilloscope throwdown

Let’s get started reviewing oscilloscopes! In the world of electronics engineering, an oscilloscope is crucial to helping diagnose problems with noise and data communications. Each of the five oscilloscopes reviewed in this episode, from Tektronix, Keysight, and Rohde & Schwarz, has its own set of features and trade-offs. Which one do you prefer? Or are you still figuring out how to use one? Let the team know over on the element14 Community.
Awesome tech you can’t buy yet: Ultra-grippy Socks and Dirt-cheap 3D Printers
At any given moment, there are approximately a zillion different crowdfunding campaigns happening on the Web. Take a stroll through Kickstarter or Indiegogo and you’ll find no shortage of weird, useless, and downright stupid projects out there – alongside some real gems. We’ve cut through the Pebble clones and janky iPhone cases to round up the most unusual, ambitious, and exciting new crowdfunding projects out there this week. That said, keep in mind that any crowdfunding project — even the best intentioned — can fail, so do your homework before cutting a check for the gadget of your dreams.
Wiral — cable slider for GoPro
Cable cam systems are awesome. When used properly, they allow filmmakers to capture jaw-dropping shots that would otherwise be impossible. The only problem is that, more often than not, these rigs are cumbersome, complex, and extremely expensive, so they’re generally out of reach for amateurs and casual videographers. But thanks to a startup called Wiral, that might soon change. The California-based company has recently taken to Kickstarter to crowdfund the development of an affordable, compact, and simple-to-use cable cam system designed for compact cameras.
Wiral Lite, as it’s called, is a complete cable cam rig that fits in a backpack, sets up in three minutes, and accommodates a number of different lightweight cameras, including GoPros and smartphones. Oh, and did we mention it’s motorized? Once you’ve set the cable and pulled it taut, Wiral Lite allows filmmakers to drive the dolly from a snail’s pace 0.006 mph all the way up to 28 mph, shooting for up to three hours on the built-in battery. A time-lapse mode also allows for moving time-lapses at three different speeds. If you’re looking to take your GoPro footage to the next level, look no further.
NewMatter Mod-t 2.0 — affordable 3D printer
Back in 2014, there weren’t many sub-$500 3D printers floating around — but then NewMatter burst onto the scene with the Mod-t, a unique new printer with a simple design and an affordable ($399) price tag. The machine was a resounding success on Indiegogo, but like many first-generation products that are brought to life via crowdfunding, it had some problems that needed to be fixed. Fast forward to the present, and NewMatter is finally back with the new-and-improved version that addresses those issues: the Mod-t 2.0.
In place of belts and gears, the Mod-t uses a toothed build plate placed atop two perpendicular pinion rods. As these grooved rods spin, they catch the teeth on the bottom of the build plate and move it in a given direction. This configuration doesn’t boost accuracy or precision in any major way, but what it does do is simplify the overall design of the printer. Because the pinion rod setup combines the driving force of one axis with the guiding force of another, the Mod-t requires far fewer parts than it otherwise would. This makes it cheaper and easier to manufacture than most other 3D printers, and allows NewMatter to sell the printer for such a low price. You can get one on Kickstarter right now for less than $200!
SpeedGrip Socks — high-traction athletic socks
Ever since the dawn of athletic footwear, shoe manufacturers have been trying to outdo each other. If it seems like shoes get more and more advanced with each passing year, it’s because they do. Just take a stroll through the nearest Nike outlet and you’ll encounter everything from shock-absorbing foam to 3D printed insoles. But while the footwear industry has been so fixated on shoes, the other side of the equation — namely, socks — has largely been left behind. But NY-based upstart Storelli Sports has a plan to change that.
The company’s latest product — SpeedGrip Socks — is a clever new take on athletic socks. When paired with a set of specialized insoles (which Storelli crowdfunded on Kickstarter earlier this year), SpeedGrip socks provide outrageous amounts of traction — not between your foot and the ground, but between your foot and your shoe. This, in turn, translates to more reliable grip and better overall performance, since your foot doesn’t slide around as much inside your footwear. Why aren’t more companies doing this?
I’m Back — digital upgrade for analog cameras
As you may or may not have noticed, film photography has enjoyed a resurgence as of late, and as it continues to claw back some of its former popularity, inventors are finding more ways to blend classic photography with digital convenience. I’m Back is the latest such invention to hit the crowdfunding scene. After finding success with a 3D printed, Raspberry Pi-powered film camera, the creators of the device are back with a clever new gizmo that transforms old film cameras into digital shooters.
Here’s how it works. Instead of popping a roll of 35mm film into your old camera, you open up the back and attach the camera to I’m Back. The device’s 16 megapixel sensor will then pick up light that passes through the camera’s lens, and save it to an SD card. If you’d like to see the photo afterward, you can even connect your smartphone and use it as a display screen.
The Universe in a Sphere — Glorious desk ornament
Remember that scene from Men In Black? The one that zooms out to reveal that our entire galaxy sits inside the marble on a cat’s leash? Well, if that scene stuck with you, there’s a good chance you’ll appreciate this new desktop trinket that recently popped up on Kickstarter. The Universe in a Sphere is exactly what it sounds like: a desk ornament that contains a tiny scale model of the cosmic neighborhood that we live in.
“What I did was is to take a catalog of galaxies, including our home supercluster called Laniakea, converted the XYZ coordinates and selected all of the 675,758 galaxies in a radius of 125 megaparsecs,” creator Clemens Steffin told Digital Trends in an interview. “One megaparsec stands for about 3.2616 million light years, so the cloud in my glass sphere represents a diameter of 815,400,000 light years.” Steffin next searched for (and found) a company capable of lasering each of these dots, one for every galaxy, into a glass sphere. After that, he launched his Kickstarter.
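The selection Steffin describes is, computationally, a simple distance filter over XYZ coordinates. A minimal sketch of that step, with a tiny made-up catalog standing in for the real 675,758-galaxy dataset:

```python
import math

MPC_TO_LY = 3.2616e6  # one megaparsec is about 3.2616 million light years

def within_radius(galaxies, center, radius_mpc):
    """Keep galaxies whose XYZ position (in megaparsecs) lies within
    radius_mpc of the given center point."""
    return [g for g in galaxies if math.dist(g, center) <= radius_mpc]

# Tiny made-up catalog: three galaxies at XYZ positions in megaparsecs.
catalog = [(0.0, 0.0, 0.0), (60.0, 80.0, 0.0), (200.0, 0.0, 0.0)]
selected = within_radius(catalog, (0.0, 0.0, 0.0), 125.0)

print(len(selected))          # 2 -- the galaxy 200 Mpc out is dropped
print(2 * 125.0 * MPC_TO_LY)  # 815400000.0 light years across
```

The diameter calculation in the last line matches the 815,400,000 light years Steffin quotes for his 125-megaparsec radius.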
Air traffic controllers may get a break from non-stop drone reports
Air traffic controllers have it bad enough managing full-size aircraft, but they face an extra headache when you throw drones in the mix. You see, controllers get calls when drone pilots want approval to fly within 5 miles of an airport — and with an average of 250 reported close encounters per month, it’s clear that some aren’t even bothering with the formalities. The FAA has clearly had enough of this, as it recently made an emergency request to bypass the usual regulations and use an automated system to approve drone flights in restricted airspace. Instead of waiting 2-3 months for clearance (or calling in at the last possible moment), you could get the A-OK within 5 minutes.
There’s no certainty that the FAA will get what it wants, but it does make a convincing case. The administration had a backlog of 6,000 pending approvals as of its request, and it expected that queue to grow to 25,000 approvals in as little as 6 months. Pair that with a drone collision in September (one of the robotic fliers struck an Army helicopter near Staten Island) and it’s a recipe for danger. Pilots frustrated with a lack of progress may be more and more likely to ignore approvals, causing chaos both for air traffic control and conventional aircraft crews.
At the same time, the streamlined process could prove a boon to not just overworked controllers, but drone operators of all stripes. Businesses that absolutely need drones won’t have to wait ages to get the green light, and individual drone owners would get the opportunity to report their flights. The main challenge is ensuring that malicious and careless drone owners don’t slip through the cracks. The last thing the FAA wants is to approve someone hellbent on flying into harm’s way, and it may be difficult to completely prevent that kind of aerial assault.
Via: Bloomberg, Wall Street Journal
Source: Federal Register
Fiber optic lines can double as earthquake detectors
You might not need an extensive sensor network or a host of volunteers to detect earthquakes in the future — in fact, the lines supplying your internet access might do the trick. Researchers have developed technology that detects seismic activity through jiggling in fiber optic lines. Laser interrogators watch for disturbances in the fiber and send information about the magnitude and direction of tremors. The system can not only detect different types of seismic waves (and thus determine the seriousness of the threat), but spot very minor or localized quakes that might otherwise go unnoticed.
Fiber-based detection isn’t strictly new, but it previously centered on acoustic sensing that required wrapping the fibers in cement, sticking them to a surface, or otherwise making sure they contact the ground (to make it easier to spot impurities in the signal). That’s not necessary with the new method — you can use existing fiber lines housed in plastic pipes. It should be considerably easier and cheaper to implement these detectors.
There are plenty of challenges to making this a reality. It’s limited by the size of the fiber network, so it could miss rural areas that don’t have much if any fiber. And the current proof of concept is a relatively modest 3-mile loop around Stanford University. It could be a much more daunting prospect to run a sensor network across an entire city, let alone cross-country. This could still be far more affordable than rolling out dedicated sensors, however, and the sheer precision of using fiber (every part of the line counts) could provide earthquake data that hasn’t been an option before.
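In signal-processing terms, the interrogator’s core task is to flag stretches where the fiber signal gets noisier than its quiet baseline. A toy Python illustration of that idea (a simple sliding-window variance check, not the researchers’ actual algorithm):

```python
import statistics

def detect_tremor(signal, window=5, threshold=3.0):
    """Return start indices of windows whose standard deviation exceeds
    threshold times the quiet baseline, i.e. candidate disturbances."""
    baseline = statistics.pstdev(signal[:window]) or 1e-9
    hits = []
    for i in range(len(signal) - window + 1):
        if statistics.pstdev(signal[i:i + window]) > threshold * baseline:
            hits.append(i)
    return hits

# A quiet fiber shows only tiny jitter; then a simulated tremor shakes it.
quiet = [0.01, -0.01, 0.02, -0.02, 0.01, -0.01, 0.02, -0.02]
tremor = [0.5, -0.6, 0.7, -0.5]
signal = quiet + tremor + quiet

print(detect_tremor(signal))  # only windows overlapping the tremor burst
```

A real interrogator works on optical phase data at enormous sample rates and must also classify wave types, but the underlying decision (is this stretch of signal unusually energetic?) is the same.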
Source: Stanford
Windows 10 now includes anti-cheat protection for games
Windows 10’s Fall Creators Update is full of changes, but one of the understated additions could make a big difference if you’re a gamer. Microsoft has switched on its previously teased TruePlay feature, which promises to protect against “common” cheats in Universal Windows Platform games. Titles that take advantage of the safeguard will both run in a protected mode and trigger a background service that watches for typical cheating behavior. If the service finds anything amiss, it sends data to the developer. You can switch off TruePlay if you’re nervous about Windows transmitting your data, but companies can limit what you’re allowed to do (playing online, for example) if you don’t have it enabled.
Of course, the dependence on UWP limits its usefulness. You’re more likely to see Valve Anti-Cheat because of Steam’s sheer dominance in the gaming world. Consider this, though: TruePlay plugs a hole in anti-cheat protection, and it helps put UWP games more on par with their Steam counterparts. A developer might be more likely to write a UWP version of a title knowing that a few bad apples won’t ruin the online experience for honest players.
Via: PC Gamer, Ars Technica
Source: Windows Dev Center
The Moon is a floating eyeball that can control and monitor your house
Why it matters to you
Home security is getting more sophisticated, with smart tech and 360-degree surveillance becoming more accessible than ever.
If you’re looking for a smart hub for your home — one that’s very cool and slightly creepy to boot — the new Indiegogo project from 1-Ring may be the gadget for you. The Moon is a levitating 360-degree smart camera with day- and night-vision capabilities, wireless charging, cloud storage, and numerous sensors to monitor everything that’s happening in your house.
The Moon is the first Indiegogo venture from 1-Ring, and it will be available early next year. The developers have already blown past their funding goal, and still have a month left to go. “The world’s first levitating camera” rises gently and floats quietly just above its base, and it can smoothly rotate in any direction for a full 360-degree scan of its surroundings. It has voice and noise recognition built in, and it charges wirelessly, so it never runs out of juice.
The Moon also integrates seamlessly with your very own Internet of Things (IoT) and your smart home devices. Cameras are becoming more and more common in smart homes, and the Moon device is compatible with Amazon Alexa and Google Home, with support for Apple HomeKit coming soon. It uses ZigBee, Z-Wave, Bluetooth Smart, and an IR blaster to interact with your home devices.
The sphere encasing the wide-angle lens is made of aluminum, plastic, and rubber. When the camera senses sound or motion in the room, it can rotate toward the source and automatically begin recording. It also includes a speaker and three noise-cancelling microphones so it can be used for voice chat.
You can automatically upload photos or video to a cloud storage service such as Google Drive or Dropbox. If you’re paranoid about privacy (and who isn’t these days?) you can also opt for storage on your own FTP servers or microSD card, notes Android Police.
The device is controlled remotely with the free Moon Commander app, letting you keep an “eye” on things when you’re away from the house. The camera can be removed from its base and magnetically attached to any metal surface. A full wireless charge allows for five hours of HD video streaming.
Sensors in the Moon can monitor room temperature, humidity, light, and carbon dioxide levels. It also includes a “Presence Simulation” tool that occasionally switches lights or other devices on or off as if someone were home.
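A presence simulator like this boils down to a scheduler that toggles randomly chosen devices at randomized intervals. A hypothetical sketch of the concept (the device names and timing ranges here are invented for illustration, not 1-Ring’s implementation):

```python
import random

def presence_schedule(devices, toggles, seed=None):
    """Build a pseudo-random on/off schedule, in minutes from now,
    that mimics someone moving around the house."""
    rng = random.Random(seed)  # seed makes the schedule reproducible
    schedule, minute = [], 0
    for _ in range(toggles):
        minute += rng.randint(5, 45)  # wait 5 to 45 minutes between toggles
        schedule.append((minute, rng.choice(devices), rng.choice(["on", "off"])))
    return schedule

# Example: two lights and a radio, six toggles over the evening.
for minute, device, action in presence_schedule(
        ["hall light", "bedroom light", "radio"], toggles=6, seed=42):
    print(f"t+{minute:3d} min: turn {action} the {device}")
```

A real product would layer on sunset times and learned routines, but the randomized-toggle core is what makes the house look occupied.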
If all goes according to schedule (this being an Indiegogo project, after all) the Moon will be available in March 2018 for a retail price of $330, although you can pre-order from its funding page for $209.
When protecting medical devices from hacks, is the cure worse than the disease?
Why it matters to you
Whether you use a pacemaker or other medical device, the convenience of connectivity with your doctor carries some new risks.
Medical technology, like other tech, has become increasingly connected in recent years. Even devices such as pacemakers are now connected to the internet. This allows doctors to monitor problems such as irregular heartbeats or failing battery life. However, internet connectivity brings with it new risks for pacemakers and other medical devices. The Wall Street Journal has reported that Abbott Laboratories has rolled out an update intended to protect pacemakers from being hacked, but some medical professionals are concerned that the risks outweigh the rewards.
Abbott warned that its newest update has the potential to cause malfunctions within the pacemakers. Since Abbott Laboratories released the update, the Food and Drug Administration has received at least 12 reports of pacemakers malfunctioning during the update process. In some cases, the devices failed to update properly. Even when the devices did update correctly, there were reports that some of the pacemakers went into backup mode during the updates. In backup mode, the pacemakers are reset to default settings rather than those customized for a particular patient.
Despite the glitches, none of the reports contained any mention of serious harm being caused to the patients.
While there have been multiple reports of malfunctions during the updating process, the FDA has received no reports of any of the hacks which the updates were meant to prevent. Mike Kijewski, chief executive of MedCrypt, said that the risk was fairly low due to the fact that a hacker would have to be within close proximity of the patient in order to get to his or her pacemaker.
The low risk of hacking combined with the higher risk of malfunction has led some doctors to simply refuse the updates.
“It’s not really a risk we’re willing to take at this point,” said Bruce Lerman, chief of cardiology at New York-Presbyterian/Weill Cornell Medical Center. “We don’t feel the benefit at this point necessarily outweighs the potential risk of uploading this software.”
Abbott Laboratories pacemakers are currently in use by about 465,000 patients in the United States, and they aren’t the only medical devices at risk of being hacked. In December of last year, the FDA released new guidelines for protecting medical devices from hackers.
Five-minute allergy test passes the FDA’s scrutiny
A few years ago, researchers from the Swiss Federal Institute of Technology in Lausanne (EPFL) started developing what they eventually dubbed the “world’s most rapid” allergy test. Now, that test has received the FDA’s approval and will start telling you what you’re allergic to in as little as five minutes next year. Abionic, the EPFL spinoff that took over the test’s development in 2010, created the abioSCOPE platform and its accompanying single-use test capsules to be able to detect your allergies with just a single drop of blood.
After combining the blood with a reagent, the tester will place the mixture on the platform’s DVD-like mounting plate (see above) and allow it to form molecular complexes with the test capsules. Initial results will pop up on abioSCOPE’s screen in five minutes — the full results are due three minutes later. The system uses the integrated fluorescent microscope’s laser to check for the presence of those complexes, so you can quickly find out if you’re allergic to dogs, cats, common grass and tree pollens. Sure, the system can only test for four kinds of allergens, but at least you don’t have to undergo anything uncomfortable or invasive just to find out you’re allergic to your lawn.
Source: EPFL, Abionic (1), (2)



