How to watch F1 2017 in 4K Ultra HD
The new Formula One season is almost upon us and it promises to be one of the most exciting for years.
Not only do car rule changes promise closer racing – amongst the top three teams, anyway – but the entire season is to be broadcast in 4K Ultra HD. It will look better than ever.
So whether you are an F1 nut or fan of 4K visuals, here’s how to watch F1 2017 in 4K UHD.
When does the F1 2017 season start?
The 2017 Formula One season starts with the Australian Grand Prix on Sunday 26 March, with practice and qualifying on Friday 24 and Saturday 25 March respectively.
As it is held in Australia, UK fans will have to get up extra early to watch the live coverage. The main race, for example, starts at 6am BST on the Sunday. Thankfully, as the clocks go forward earlier that morning, that’s an improvement on previous years, when the race has kicked off at 5am.
What Formula One 2017 races will be in 4K Ultra HD?
All of them!
The entire F1 2017 season will be live on Sky Sports in 4K. All races in the new calendar will be available in Ultra HD. They will also be available in Full HD for those without the right equipment to view them – with some also broadcast by Channel 4 in 1080p. Qualifying and practice will also be live on Sky Sports, but not in 4K.
These are the races on the F1 2017 calendar:
- 26 March, Australia, Exclusively live on Sky F1 (4K Ultra HD)
- 9 April, China, Exclusively live on Sky F1 (4K Ultra HD)
- 16 April, Bahrain, Live on Sky F1 (4K Ultra HD) and Channel 4 (Full HD)
- 30 April, Russia, Live on Sky F1 (4K Ultra HD) and Channel 4 (Full HD)
- 14 May, Spain, Exclusively live on Sky F1 (4K Ultra HD)
- 28 May, Monaco, Live on Sky F1 (4K Ultra HD) and Channel 4 (Full HD)
- 11 June, Canada, Exclusively live on Sky F1 (4K Ultra HD)
- 25 June, Azerbaijan, Live on Sky F1 (4K Ultra HD) and Channel 4 (Full HD)
- 9 July, Austria, Exclusively live on Sky F1 (4K Ultra HD)
- 16 July, Great Britain, Live on Sky F1 (4K Ultra HD) and Channel 4 (Full HD)
- 30 July, Hungary, Exclusively live on Sky F1 (4K Ultra HD)
- 27 August, Belgium, Live on Sky F1 (4K Ultra HD) and Channel 4 (Full HD)
- 3 September, Italy, Exclusively live on Sky F1 (4K Ultra HD)
- 1 October, Malaysia, Live on Sky F1 (4K Ultra HD) and Channel 4 (Full HD)
- 8 October, Japan, Exclusively live on Sky F1 (4K Ultra HD)
- 22 October, USA, Live on Sky F1 (4K Ultra HD) and Channel 4 (Full HD)
- 29 October, Mexico, Exclusively live on Sky F1 (4K Ultra HD)
- 26 November, Abu Dhabi, Live on Sky F1 (4K Ultra HD) and Channel 4 (Full HD)
How do I watch F1 2017 in 4K Ultra HD?
All races are being broadcast live by Sky on the Sky F1 channel for Sky Sports subscribers. However, to watch the races in Ultra HD you will need to be a Sky Q customer with the Sky Q 2TB box too.
You will also need to have a 4K TV with HDCP 2.2 copy protection to watch the live 4K feed.
Like with Sky Sports’ live football matches, many of which have been simultaneously broadcast in 4K, you will be prompted by the Sky Q on-screen EPG when you are on the correct channel and the coverage starts. Alternatively, you can press the red button on your remote to view the race in Ultra HD.
There will also be a thumbnail in the Ultra HD section of the main Sky Q menu that will start the 4K version of the race.
Sadly, Sky+HD and Sky Q 1TB box customers cannot view 4K Ultra HD content, so they cannot watch the races in that format. You also can’t watch races in 4K on Sky Q Mini boxes, even if you have the 2TB main box, because the Minis are limited to 1080p.
Instagram now lets you save your live videos to your phone
Instagram first revealed its take on livestreaming back in November and has continued to roll out the feature around the world in early 2017. Today, the social network announced that users can now save that live footage to their phone after the broadcast is over. Yes, the Live Stories will still disappear from the app when you complete the stream, but in the event something worthwhile happens, you now have the option to archive the footage for yourself.
There’s a new “save” button in the top right corner of the screen after you finish a live session. Instagram notes that the video is saved without any of the likes, comments or other interactions from the broadcast. As you might expect, the file lands in your camera roll for easy access. Facebook Live already lets users save videos for later, so it’s no surprise that Instagram would add the same ability. The company says the save function is now available in both the Android and iOS versions of the app, so you shouldn’t have to wait to use it.

Source: Instagram
Museums use CT scans to take the mystery out of mummies
Most of us have a rather cinematic view of mummies: a bandaged body rising out of a sarcophagus, stumbling toward whoever just disturbed their slumber. Of course, this could never happen and not just for supernatural reasons. Mummies are wrapped up pretty tight and are just too old and fragile to do anything. In fact, they’re often too delicate for scientists to even study them, meaning many human remains have sat in storage for more than a century. However, an exhibit making its way to New York’s American Museum of Natural History today not only takes them out of the warehouse, but also tells us more about the people wrapped inside, thanks to some help from modern technology.
“Mummies” was developed by Chicago’s Field Museum, which lays claim to the largest collection of Egyptian and Peruvian specimens in the US. But many of these haven’t been seen by the public since the 1893 Chicago World’s Fair. Back then, the only way to take a closer look was to physically unwrap the body, risking damage to the funereal objects and preserved remains. When X-rays arrived on the scene, they presented a boon to researchers, finally letting them peek inside. The first mummy was photographed this way in 1898, and the Field Museum’s entire collection was examined and written about in 1931. But standard X-rays are lacking in nuance: They’re limited to 2D representation, capture a lot of irrelevant data, and can’t distinguish between objects of different densities.
Computed tomography scanning addresses all these problems. The first time a mummy was examined via CT was way back in 1977, just four years after the tech appeared in hospitals. But it’s only in the past decade that the Field Museum did a deep dive into its collection. This was made possible with portable CT scanners that could be brought to its facility, rather than taking the risk of transporting specimens to a hospital or lab.

The researchers at the Field learned a lot. One body, the “Gilded Lady,” belonged to a middle-aged woman who lived during the Roman era. They found she had an overbite and tight curly hair. Another mummy whose coffin identified him as “Minirdis” turned out to not be Minirdis at all; the body was 200 years younger than the coffin, indicating that the box was recycled. The CT scans provided enough info that the scientists were able to print 3D copies of the skulls, passing them off to an artist to do full facial reconstructions that are also on display.
3D printing factors somewhat heavily in the exhibition, allowing museums like the AMNH to showcase replicas of various artifacts. Visitors can examine small details they might otherwise miss when an object is trapped behind glass, as well as get a better sense of its size and weight.
Another way visitors are given an intimate look at the mummies is through a series of touchscreens that let people “virtually unwrap” them, delving down through each individual layer: skin, bones, organs and small artifacts buried with the body. This is particularly revealing when it comes to the Peruvian remains, which were often buried in “bundles” instead of laid out flat. The bodies were placed in a sitting position and then wrapped up, with objects of significance tucked into each layer. How the body was positioned and where each item was located holds a lot of useful information about that person and their culture, which would have remained unknown without the CT scan, and completely lost if scientists ever risked unwrapping the human specimens.

Of course, CT scans can’t reveal everything. While DNA testing and isotopic sampling can provide troves of information, they still involve some physical intrusion on the remains. James Phillips, a curator at the Field Museum, told me that his most-wanted technological advancement would be a way to obtain DNA without having to drill into the body. For the scientists at both the Field and the AMNH, this is the ultimate goal: to treat human remains respectfully and non-invasively, gaining as much knowledge as possible from them while ensuring they’ll still be around for a few more millennia.
“Mummies” opens March 20 at the American Museum of Natural History and closes in January 2018.
YouTube responds to allegations it censored LGBTQ+ videos
YouTube has found itself accused of effectively censoring LGBTQ+ content, and its creators, thanks to its restricted mode. YouTuber Rowan Ellis discovered that the site has been marking videos concerning gender and sexuality as inappropriate content. As such, those clips are hidden if the user is viewing the service through its restricted mode.
Outrage spread quickly, with YouTube coming under fire from prominent members of the community, like Tyler Oakley, as well as stars such as Tegan and Sara. The majority of the videos in question did seem to be LGBTQ+-themed, but unrelated clips were blocked too. As The Guardian notes, music videos from Taylor Swift, Katy Perry and 5 Seconds of Summer were also marked as inappropriate.
A message to our community … pic.twitter.com/oHNiiI7CVs
— YouTube Creators (@YTCreators) March 20, 2017
The company tweeted a response, explaining that there was no blanket ban on LGBTQ+ videos, though specific videos dealing with mature themes may be restricted. Later, a spokesperson told The Guardian that YouTube regretted any confusion and would look into the issue.
The fact that restricted mode relies on community flagging, age restrictions and “other signals” suggests that the filtering is algorithmic rather than deliberately biased. But YouTube’s automated systems have landed the company in hot water recently over the placement of ads next to hate speech clips. Without some reform, the controversies are likely to keep mounting for the world’s most popular video destination.
Via: The Fader
Source: The Guardian
‘Zelda’ fan creates an ocarina-controlled smart home
In the real world, an ocarina is a lot less functional than the magical one Link has in The Legend of Zelda: Ocarina of Time. As Nintendo 64 lovers know, the Hyrulian hero can use his instrument to do things like manipulate the rain and switch between night and day. In our realm, ocarinas just sound nice. Allen Pan, better known as Sufficiently Advanced on YouTube, decided he was tired of playing his ocarina without mystical results. So, he did what any Zelda fan with the technological know-how would do: create a smart home setup controlled by an ocarina.
The system’s hub is basically just a microphone connected to a Raspberry Pi. The device also features a light ring that indicates when it hears the ocarina and a speaker that plays the classic accomplishment chime when it recognizes a successful input. By playing songs from the game, Pan can control several WiFi-connected devices in his home, all of which seem to be homemade.
It’s not the cleanest setup we’ve seen, but this DIY smart home is pretty clever nonetheless. The songs trigger certain actions that correspond with their functions in the game, making for an intuitive system. For instance, “Sun’s Song” turns on the lights, “Song of Storms” controls the humidifier, “Bolero of Fire” cranks up the heat, and so on.
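Pan hasn't published his code in detail, but the core idea of the setup described above — detect which notes were played, then match the sequence against known songs to trigger an action — can be sketched roughly. This is a hypothetical illustration, not Pan's actual implementation: the note sequences and action names are invented, and real pitch detection from the microphone (e.g. via a pitch-tracking library) is assumed to have already happened.

```python
# Hypothetical sketch of song-to-action matching for an ocarina-controlled
# smart home. Pitch detection is assumed to have converted microphone audio
# into note names already; here we only match the resulting sequence.

# Illustrative note sequences -- not the actual in-game melodies.
SONGS = {
    ("A", "F", "D", "A", "F", "D"): "lights_on",          # "Sun's Song"
    ("D", "F", "D", "F", "A", "D"): "toggle_humidifier",  # "Song of Storms"
}

def match_song(notes):
    """Return the smart-home action for a played note sequence, or None."""
    return SONGS.get(tuple(notes))
```

A real version would also need tolerance for slightly mistimed or off-pitch notes, which is where most of the engineering effort in a system like this would go.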
This down-and-dirty setup may not be as easy to use as Alexa or Google Home, but it looks like a lot more fun. As The Verge points out, though, this probably isn’t the most secure IoT device. Security is a problem even with more established voice-controlled gadgets, so proceed with caution if you’re thinking about recreating this system for yourself.
Via: The Verge
Chaos leads to stronger carbon fiber
Carbon fiber is widely used in aircraft and performance cars thanks to its light-yet-strong nature, but it’s still a fuzzy science. What’s the ideal baking temperature, and the resulting degree of chaos in carbon atoms, that you need to make the sturdiest material? MIT might finally have an answer. Its researchers have discovered a link between the random order of carbon atoms in a baked resin and its consequent density and strength. And by doing so, they’ve found an ideal baking temperature that makes the carbon as random (and thus as light and strong) as possible.
The team found that the inherent disorder in this particular resin (phenol-formaldehyde) formed 3D connections that are not only tough to break, but only require a fraction of the usual carbon atoms to create their structure — you could use much more of it in a vehicle and still have a lighter design. If you bake the resin at any temperature higher than 1,832F, the atoms become more orderly and weaker.
It’s easy to see use cases: you could see more efficient aircraft and cars that travel further and faster without compromising their strength. And the researchers see even more potential in the long run. You could get even more resilient carbon fiber by wrapping it in other structures (such as nanotubes), or take advantage of the weight savings to include sensors.
Source: MIT News, ScienceDirect
eBay takes on Amazon with guaranteed 3-day shipping
In an effort to deal with formidable rival Amazon, eBay has launched a new program giving shoppers guaranteed three-day shipping on 20 million products. Called “Guaranteed Delivery,” it also includes free shipping on “millions” of those items, according to eBay, and will roll out in the US starting this summer. The auction site also revamped its home page today to provide a more personalized experience for shoppers.
eBay points out that it has 1.1 billion items listed at any given time, of which 67 percent already ship for free and 63 percent arrive within three days or fewer. With “Guaranteed Delivery” items, however, eBay will either refund the shipping or give you a coupon if they don’t arrive on time.
Buyers looking to get items speedily will be able to filter searches based on one-, two- and three-day delivery times. The company is also providing qualified sellers with new shipping tools so that they can in turn give buyers more accurate delivery times. Such things have become crucial for consumers in a world where Amazon can deliver an item within an hour and let you track packages to an exact spot on the map.
On top of that, eBay has launched a new home page that gives consumers a more, well, Amazon-like buying experience. Rather than forcing customers to follow sellers they like to see preferred products, it will feature stacked horizontal image carousels. Rows will be arranged by items that you’ve viewed recently and put on watch lists curated by eBay’s algorithms. That works much the same way as content on Netflix, and eBay compared its new offering specifically to that service’s home page.
It’s not the first time eBay has changed its design — in 2012, it adopted a Pinterest-like list of offerings, and a year later, a look based on the user’s interests. However, both of those designs required some user intervention and “that was a lot to ask a casual customer,” eBay’s Head of Personalization and Engagement Bradford Shellhammer told Recode.
The desktop redesign rollout is now live, but it may not reach your neighborhood until mid-2017, and the mobile refresh is coming at the end of the year — something that should tell you a lot about who eBay’s customers are. As mentioned, the Guaranteed Delivery service will arrive this summer in the US.
Via: TechCrunch
Source: eBay
Apple Seeds Eighth Beta of macOS Sierra 10.12.4 to Developers and Public Beta Testers
Apple today seeded the eighth beta of an upcoming macOS Sierra 10.12.4 update to developers and public beta testers, four days after seeding the seventh macOS Sierra 10.12.4 beta and two months after releasing macOS Sierra 10.12.3.
The eighth macOS Sierra 10.12.4 beta is available for download through the Apple Developer Center or the software update mechanism in the Mac App Store for those who have previously installed a beta.
macOS Sierra 10.12.4 brings iOS’s Night Shift mode to the Mac for the first time. First introduced on iOS devices with iOS 9.3, Night Shift is designed to gradually shift the display of a device from blue to yellow, cutting down on exposure to blue light. Blue light is said to disrupt the circadian rhythm and is believed to interrupt sleeping patterns.
Night Shift can be activated through the Displays section of System Preferences, where a setting to have it come on at sunset and turn off at sunrise is available. Night Shift can also be toggled on manually through the Notification Center or via Siri.
The 10.12.4 update focuses mainly on Night Shift, but also includes dictation support for Shanghainese, cricket scores for Siri, improved PDFKit APIs, and iCloud Analytics options.
Few outward-facing changes have been discovered in macOS Sierra 10.12.4 outside of the first beta, as Apple has been working on bug fixes and improvements ahead of a public release. With the eighth beta out and the short interval since the last seed, we are getting closer to the end of the beta testing period and are likely to see a public release soon.
Apple’s iPhone 8 Said to Feature ‘Water Drop Design’ in Homage to Original iPhone
Apple’s upcoming 2017 iPhone will feature a design that’s similar to the original iPhone, according to industry analysts who spoke to Korean site ETNews [Google Translate].
The site says the iPhone will use a “water drop design” that’s an homage to the original iPhone, with a rear curve that is both gentler and rounder than existing metal case edges.
Apple is said to be using a three-dimensional glass material on the back of the iPhone 8 to make it more closely resemble the deeper curves on the case of the original iPhone. It would, of course, be much larger than the first iPhone, with rumors suggesting a 5.8-inch display, and it would undoubtedly be much thinner.
The “3D glass case” is said to “make curves” around the top, bottom, left, and right edges of the iPhone, moving away from the less curved, flat back design that was introduced with the iPhone 4 and has been used in every iPhone up to the iPhone 7.
Apple used a curved aluminum and plastic design for the original iPhone and plastic alone for the iPhone 3G. Shape wise, the iPhone 8 is said to resemble the first iPhone, but it will use all glass, similar in design to the plastic used in the second iPhone.
While an original iPhone-style curved back is rumored to be included, ETNews agrees with existing display information and suggests the OLED screen of the iPhone 8 will be “relatively flat.” It will not feature a dramatically curved edge like the Samsung Galaxy line.
There have been several mixed rumors about the curve of the iPhone 8’s display due to difficulty interpreting details about what constitutes a curve, but information seems to be aggregating around a 2.5D design that features a similar curve to the existing iPhone 7. We expect the display, which is said to be edge-to-edge with no side bezels, to curve just slightly downward at the edges much like the cover glass of the iPhone 7.
Little has been said about the rear design of the 2017 iPhone, so there is no information to back up the claims shared by ETNews as of yet. Rumors do, however, suggest that it will indeed use an all-glass body, with Apple planning to move away from aluminum.
There has also been speculation that the 2017 iPhone will be celebrated and introduced as a 10th anniversary device, and in that light, an homage to the original iPhone makes some sense, but these design rumors should be viewed with some skepticism until confirmed by additional sources.
What’s in a Name? Meet Bixby – the smart sidekick who’ll help you use your digital gear
When Samsung launches the Galaxy S8 and S8 Plus smartphones on March 29, the team of virtual assistants aiming to scour your inbox and tidy up your digital life will get just a little bit more crowded.
Alexa, Siri, Google Assistant, Cortana: Meet Bixby.
There’s a new name in artificial assistants, but Samsung argues this one won’t tell you dumb jokes or a weather forecast, nor will it look up facts for you online. This bright assistant is meant to improve your interactions with your digital life — not just your smartphone but your washing machine, your thermostat, your vacuum cleaner, everything. It’s nothing less than a rethink of how we use our stuff.
Sure, those are bold words, but the head of research and development at Samsung Mobile Communications Business Group believes them.
“Philosophically, what we’re looking at is revolutionizing the interface,” Injong Rhee told Digital Trends.
A part of that means the rumors are true — there will be a dedicated Bixby button on the Galaxy S8 and S8 Plus.
But what does the assistant do? And how is it different from all of the other assistants that clutter our phones and our lives, listening in, chiming in, and chirping up to help us out? To find out, Digital Trends flew almost 7,000 miles to South Korea to talk natural language interfaces and machine learning and to answer the most vital question: Should you hand your virtual dayplanner to Bixby?
Planning the AI revolution
“After 10 years of smartphones, another revolution is waiting,” Rhee explained. We were in a conference room in Digital City, one of the consumer electronics giant’s office complexes in Suwon, South Korea, about half an hour south of Seoul.
Digital City is about two miles wide, spanning multiple office towers and residential buildings. It’s as big as a prep school, and structured like one. There are broad boulevards, an open air plaza called Central Park (with piped-in bird sounds!), and a giant underground shopping and living complex that holds two fitness centers, relaxation zones, a Samsung store, drugstores and retail chains — even a Dunkin’ Donuts.
Jeremy Kaplan/Digital Trends
About 20,000 people work in Digital City, most commuting in from nearby suburbs. They eat three meals a day at company dining halls, swelling the corridors from 11:30 to 1:30 for lunch, an army of people looking at cell phones, working on cell phones, planning new cell phones, shopping for cell phones.
Rhee has the good looks of a rock star — and a haircut to match. He’s charming to listen to, engaging, and clearly passionate about Bixby, which he thinks will be an extraordinary tool. You see, machine learning powers the army of assistants that have popped up in recent years, from the super intelligent IBM Watson system that can outsmart Jeopardy champions and dream up its own recipes to less agile bots like Siri, the friendly assistant that lives in your iPhone and sets appointments for you, tells you whether you need a jacket, and finds information. While machine learning is enormously important, making it useful to consumers has been the challenge.
Forget Watson – that’s a whole different class of system. Assistants aimed at consumers like Alexa, Siri, and Google Assistant (why no cute name, Google?) seem best suited for looking up information, Rhee argues: they don’t really help you use your gadget. Think about the term “user interface”: Bixby aims to put the “use” back in it.
“Look at the number of tasks current agents can perform,” Rhee said. “It’s about a hundred. We’re looking at covering everything you can do with a touch command. This is a very ambitious goal.”
Samsung’s Gallery app can do 300 tasks, for example, and there are about 15,000 ways people can perform those tasks with their fingers using menus and so on. Bring in voice and it gets complicated. Rhee said the number of ways people can speak those commands “varies over millions,” and today’s assistants don’t help us do the vast majority of things we’d like to with our phones, much less comprehend the millions of ways we could ask for help doing them.
What can Bixby do?
Here’s the thing: This information is all based on promises from Samsung. The company couldn’t provide us with a lot of concrete answers on what Bixby can or can’t do, so we’ll have to wait and see when it launches alongside the Galaxy S8.
What we do know is that as Siri is to the iPhone, or Google Assistant is to a Pixel, Bixby will be to the Galaxy S8 — something baked in and key to the interface. Bixby can perform informational query searches, but unlike those others its primary role is to help you use your phone.
Ask Bixby to find that photo you took last week of a pink umbrella, brighten it up a bit, and email it to Cousin Fester, and it’ll make that all happen for you. Samsung calls it multimodal capability, and one of the clever things Bixby does is handle incomplete commands gracefully. When a command doesn’t contain enough information for the assistant to complete it, Bixby will still take the user as far as it can, rather than abandoning the entire request. It’ll create that email and attach the photo, even if it doesn’t recognize Cousin Fester as Thelonious.Fester@aol.com.
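The "take the user as far as it can" behavior described above can be sketched in a few lines. This is purely a hypothetical illustration of the idea, not Samsung's implementation: the step names and resolver structure are invented. The point is that a multi-part command executes step by step and hands the remainder back to the user on the first step it can't resolve, rather than failing outright.

```python
# Hypothetical sketch of partial command fulfillment: run the steps of a
# multi-part voice command in order, stopping at the first unresolvable step
# instead of aborting the whole request.

def run_command(steps, resolvers):
    """Run steps in order; return (completed results, remaining steps)."""
    completed = []
    for i, step in enumerate(steps):
        resolver = resolvers.get(step)
        if resolver is None:              # e.g. no email known for "Cousin Fester"
            return completed, steps[i:]   # hand the rest back to the user
        completed.append(resolver())
    return completed, []

# Usage: the photo is found and attached, but the contact can't be resolved,
# so the send step is left pending for the user to finish.
resolvers = {
    "find_photo": lambda: "umbrella.jpg",
    "attach_to_email": lambda: "attached",
    "resolve_contact": None,  # unknown contact halts progress here
}
done, pending = run_command(
    ["find_photo", "attach_to_email", "resolve_contact", "send"], resolvers
)
```

The design choice here mirrors what Rhee describes: partial progress is preserved, so the user only has to supply the missing piece instead of repeating the whole command.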
But to do all this well, it needs to be able to understand natural language better than others. So can it?

Injong Rhee presents during the 2016 Samsung Developer Conference.
Samsung
“Yes with an exclamation mark!” Rhee told us. That remains to be seen, of course. While we spent a good deal of time talking with Rhee about the assistant, Digital Trends was not yet able to test the functionality itself.
At launch, there will be 10 preloaded apps that Bixby can work with, including Gallery, Contents, Settings, Camera, and so forth. Expect Calendar functionality in the second wave of Bixby-ready apps, which should come a month after the first batch, and more updates to follow on a similar timeframe. To use the service, you can simply speak to it (“Hi, Bixby!”) as you can with other assistants, but Samsung thinks that’s not the most intuitive way to use an assistant.
The Bixby button
Instead, Rhee believes all Bixby-supporting gadgets should have a button, whether that’s a smartphone for walkie-talkie style communication — in which case there’s no need to say “hi Bixby,” of course — or a washing machine with a biometric fingerprint scanner you press before asking Bixby to run a heavy cycle with a low spin speed. Bixby could be everywhere, the glue that holds together your smartlife. Picture a button on the side of your remote control, Rhee said: push it to ask your phone to find that photo you sent Cousin Fester and throw it up on the television.
“It starts off with smartphones, but anywhere that has an internet connection and a microphone, Bixby can be used,” he explained. “There’s some part of the technology that we put in the device, but a lot of it lives in the cloud.”
The interface
There are four different components to Bixby, Rhee explains: Voice Agent, Vision, Home, and Reminders. Voice agent is just what you’d expect – speak to Bixby and you’ll get a response or complete an action. Reminders is another straightforward one. The other two are more complex.
Vision is Bixby’s ability to use a camera to recognize objects – a bottle of wine, for example: Bixby will surface several buttons with labels describing the item it sees (here’s that machine learning again). Push the wine button and it’ll tell you where you can buy that beaujolais and what it costs. Or it could divine points of interest, or recognize books, and so on.
If you think this sounds remarkably similar to Google Goggles, you’re not alone. We’ll see how Samsung develops this feature.
Then there’s Bixby Home, which lives on the screen to the left of the home screen. Like the Google app (formerly Google Now), it has cards that contain information, as well as suggestions about things it knows you like to do, and Samsung plans to make it expandable and to let third parties work with it. It will know context, whether you’re at work, or home. A short press on the button brings it up, as opposed to the long press that starts Bixby listening.
The Viv Labs connection
Samsung began working on Bixby in earnest about a year and a half ago, Rhee told us, but of course the company has had voice recognition and control around for years through the S Voice app. S Voice does the standard assistant stuff – it’ll start navigation to take you home, or tell you the weather.
“Based on our experience with S Voice, we seriously really considered revamping that, and really changing that philosophy and foundation behind this to make it much easier for people to adopt,” Rhee said. “And that’s when we started working on this, about 18 months ago.”
Bixby can’t do everything, and neither can Samsung – there was clearly no reason for the engineering team to reinvent the wheel. Google is really good at searching for stuff, for example, and Rhee’s team is working on a smooth way to hand off a request from Bixby to Google. That’s where Viv comes in.
“We’re bringing Viv Labs to grow that ecosystem in a scalable manner,” he explained. Bixby’s been in the making for years, but the recent acquisition of Viv – a company formed by the people who created Siri – will help make it great. The Viv team didn’t create Bixby, but some of the work they’ve done lately, notably efforts to make assistants expandable, will help distinguish this tool.
“Viv Labs is going to help out expanding into a third-party ecosystem,” Rhee said, “to make it easier to expose functions and perfect the experience for third parties.”
So what’s with that name?
Why Bixby, you ask? It would be great if there were an interesting answer — if someone’s uncle were Philip Bixby, or some guy saved Rhee’s dog from drowning, and as he walked off, he looked back and whispered “Bixby.”
Nope.
A group called Design Center came up with the name, pulling it from a shortlist of three that had been whittled down from a list of thousands. It’s easy for a machine to recognize, with the right number of syllables for hot-word detection. It can be male or female, they say, and is more a last name than a first name. It’ll appeal to millennials, they think, a group that’s very friendly to new technology. Samsung noted that it’s also the name of a bridge in Big Sur, California, which is a little more evocative: “building bridges” between your gadgets and smart home, and between machine and human, and all that.
But before we get to the bridge, there’s still a lot of roadway left to build. Maybe that’s something Bixby can help with? Bixby, are you there? Bixby?