
18 May

Google Lens is Google Goggles on steroids — and everything Bixby Vision should have been



Google’s big data advantage might help it surpass the Galaxy S8’s fledgling Bixby feature.

Google Lens was one of the major announcements of the I/O 2017 keynote, as Google revealed the latest step in its visual search journey. This is an endeavor which can be traced back to Google Image Search years ago, and which is a close relative of the AI powering Google Photos’ object and scene recognition.

As a part of Google Assistant, Google Lens has the potential to reach every Android phone or tablet on Marshmallow and up, letting these devices recognize things visually (with a little help from location data) and conjure up information about them. For example, you might be able to identify a certain flower visually, then bring up info on it from Google’s knowledge graph. Or it could scan a restaurant in the real world, and bring up reviews and photos from Google Maps.
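The identify-then-look-up flow described above can be sketched as a two-stage pipeline: a recognizer produces a label for what the camera sees, and that label keys into a store of structured facts. This is purely an illustrative sketch in Python; the classifier and knowledge-base contents here are invented stand-ins, not Google's actual APIs.

```python
# Toy sketch of a Lens-style pipeline: classify an image, then look the
# recognized label up in a knowledge base. Both stages are fakes for
# illustration only.

def classify_image(pixels):
    """Hypothetical recognizer: returns a label for the dominant object.
    A real system would run a vision model here."""
    return "sunflower"

# Stand-in for a knowledge graph keyed by recognized label.
KNOWLEDGE_GRAPH = {
    "sunflower": {"genus": "Helianthus", "family": "Asteraceae"},
}

def lens_lookup(pixels):
    """Identify the object, then fetch whatever facts we hold about it."""
    label = classify_image(pixels)
    return label, KNOWLEDGE_GRAPH.get(label, {})

label, facts = lens_lookup(pixels=None)
print(label, facts["genus"])  # prints: sunflower Helianthus
```

The restaurant example from the text would work the same way, with location data narrowing the candidate labels before the knowledge-base lookup.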

Whether it’s through a camera interface in Google Assistant, or after the fact through Google Photos, the strength of Lens — if it works as advertised — will be the accurate identification and the ability to provide useful info based on that. It’s not too much of a stretch to imagine the feature might well be baked into the Google camera app on the next generation of Pixel phones.

Big, BIG data


Like all the best Google solutions, Google Lens is rooted in big data. It’s ideally suited to Google, with its vast reserves of visual information and growing cloud AI infrastructure. Doing this instantly on a smartphone is a step beyond running similar recognition patterns on an uploaded image via Google Image Search, but the principles are the same, and you can easily draw a straight line to Google Lens, starting with Image Search and going through Google Goggles.

Back in 2011, Google Goggles was futuristic and, in the right setting, genuinely impressive. In addition to increased speed, Google Lens goes a step further by not only identifying what it’s looking at, but understanding it and connecting it to other things Google knows about. It’s easy to see how this might be extended over time, tying visible objects in photos to the information in your Google account.


At a more advanced level, Google’s VPS (visual positioning system) builds on the foundations of Google Lens on Tango devices to pinpoint specific objects in the device’s field of vision, like items on a store shelf. As mainstream phone cameras improve, and the trend towards multiple lenses in high-end phones continues, there’s every chance VPS could eventually become a standard Lens feature.

What Bixby Vision should have been


The potential for Google Lens is only going to grow as Google’s capabilities in these areas become stronger. And the contrast with one of the Galaxy S8’s most publicized features is pretty stark. Samsung is still a relative newcomer in AI, and that’s reflected in the current weakness of Bixby Vision.

Right now Bixby can help you identify wine (badly) through Vivino, flowers and animals (with varying degrees of success) through Pinterest, and products through Amazon. Samsung doesn’t have its own mountain of data to fall back on, so it has to rely on specific partnerships for each type of object. What’s more, while Samsung can (and apparently plans to) bring Bixby to older phones via a software update, Google could conceivably flip the switch through Assistant and open the floodgates to everything running Android 6.0 and up.

Anyone who’s used Bixby Vision can attest that it just doesn’t work very well, and Google Lens seems like a much more elegant implementation. We don’t yet know how well Lens will work in the real world, but if it’s anywhere near as competent as Google Photos’ image identification skills, it’ll be something worth looking forward to.

18 May

Google Daydream: What does it do, what devices support it and what is standalone Daydream?


When it comes to virtual reality, Google has been very active in getting cheap viewers into people’s hands with Google Cardboard, and in supplying plenty of content to view through YouTube.

In 2016 Google introduced Daydream, a new VR platform for Android devices that’s built right into Android Nougat and supported by Google’s own viewer hardware, the Daydream View. In 2017, Google moved this into standalone VR, announcing plans for headsets that don’t require a phone or a connection to a computer.

Here’s everything you need to know about Daydream, Google’s fresh take on mobile VR.


What is Daydream?

Daydream effectively simplifies access to virtual reality content on a mobile device.

It has three key aspects. First, there is a hardware specification list that manufacturers must meet for a smartphone to be labelled Daydream-ready.

There is a Google-made Daydream View VR headset, although multiple manufacturers could also build their own designs (as long as they meet Google’s standards).

And there is an all-in-one hub for VR content. Daydream Home is a one-stop shop where you can start virtual reality apps or view videos while wearing the headset itself.

Why do we need Daydream?

At present, virtual reality content is fragmented. It is available from different places, but rarely all accessible from one central location. Anyone who’s gone through the charade of watching 360-degree YouTube videos on a Samsung Gear VR will know what we mean. You have to jump through several hoops just to get to the content you want to view.

Daydream is designed to solve that, at least for Android device owners. It will house the content from all mobile VR developers, no matter who they are. Sources big and small will be immediately accessible through the hub.

In addition, while Google Cardboard has been a fun, easy way to get a flavour of what VR is about, it’s hardly high-tech or, in many cases, comfortable. The Daydream View headset is a much more comfortable and practical approach than the ad hoc nature of Cardboard viewers.


What devices will work with Daydream?

Daydream viewers

To use Daydream, you’ll need a Daydream viewer of some sort. At the moment, the only Daydream headset available is Google’s own Daydream View. As well as the Daydream View headset there is also a specific Daydream remote control, which comes with the Google device. It is designed by Google and enables users to interact with apps without having to tap the side of the headset – as is the case with the current Samsung Gear VR – or fiddle with other on-headset controls.

  • Google Daydream View review: A Pixel-perfect VR experience?

At the launch of Daydream, many partners were announced, including HTC, LG, Huawei and others. So far no other viewers have launched, although some details have been shared about the Huawei VR headset. There’s no launch date for it, but we suspect it will arrive alongside a future phone, like the Huawei P10.

  • This is Huawei’s Daydream VR headset


Daydream-compatible smartphones

The list of manufacturers that have so far committed to releasing Daydream-ready phones includes Samsung, HTC, LG, Xiaomi, Huawei, ZTE, Asus and Alcatel. To be classed as Daydream-ready, a phone needs to conform to Google’s guidelines, although many flagship devices will qualify in 2017.

The following phones have been launched as Daydream ready, meaning you’ll be able to use Daydream on these devices:

  • Google Pixel (review)
  • Google Pixel XL (review)
  • Huawei Mate 9 Pro (preview)
  • Porsche Design Mate 9 (preview)
  • Moto Z (review)
  • Moto Z Force
  • ZTE Axon 7

The following devices have been announced as “coming soon” to Daydream:

  • Asus ZenFone AR
  • Samsung Galaxy S8 (review)
  • Samsung Galaxy S8+ (review)

Google confirmed that Samsung would be issuing a software update to add Daydream compatibility.

Qualcomm has also been vocal in supporting Daydream, saying that its Snapdragon 821 is Daydream ready; that’s the chip you’ll find in the Google Pixel. However, even if a device has the right hardware, its manufacturer still has to decide to support Daydream. For example, the new HTC U11 isn’t Daydream compatible, because HTC doesn’t see that as a priority (or would rather push its own Vive headset).

What about standalone Daydream VR?

Google and Qualcomm announced at Google I/O 2017 that they have partnered on a standalone Daydream VR headset. Building on the Qualcomm Snapdragon 835 platform, the companies have produced a reference device that will give you a VR experience with no need to slip a phone into the front and no need to plug it into a computer. 

Standalone Daydream VR headsets are due to hit the shelves later in 2017.

Using technology from the Tango project, it comes with what Google is calling WorldSense positional tracking, with no need for any external cameras. Movement is tracked accurately using the headset’s onboard sensors, with six degrees of freedom.
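"Six degrees of freedom" means the headset tracks both where you are (translation along x, y and z) and which way you're facing (roll, pitch and yaw). A minimal sketch of such a pose in Python, unrelated to Google's actual SDK:

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # Position: three translational degrees of freedom, in metres
    x: float
    y: float
    z: float
    # Orientation: three rotational degrees of freedom, in degrees
    roll: float
    pitch: float
    yaw: float

    def translate(self, dx: float, dy: float, dz: float) -> "Pose6DoF":
        """Move without turning: only the positional components change."""
        return Pose6DoF(self.x + dx, self.y + dy, self.z + dz,
                        self.roll, self.pitch, self.yaw)

start = Pose6DoF(0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
stepped = start.translate(0.0, 0.0, -0.5)  # lean half a metre forward
```

Three-degrees-of-freedom headsets (like most phone-based viewers) track only the rotational half of this structure, which is why leaning or walking has no effect in them.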


Lenovo Daydream VR headset

Lenovo has confirmed that it will be building a VR headset, but hasn’t revealed anything further. The company worked to produce some of the first Tango handsets, so has some experience in this area.


HTC Daydream VR headset

HTC has confirmed that it will be launching a new Vive device, which will be available later in the year. HTC gave birth to the Vive, one of the most accomplished VR headsets to date, so its experience should pay dividends in creating a new device.

What apps will be compatible with Daydream?

Many third-party app announcements are yet to be made, but Google launched a YouTube VR app and Daydream Keyboard.

We also spotted Netflix VR, HBO Go and Hulu on the list, and the company revealed that an app based on JK Rowling’s Fantastic Beasts and Where to Find Them will be coming too.

There will be plenty of games too, of course.

When is Daydream coming and how much will it cost?

Google’s Daydream View VR headset and accompanying remote control are available from Google’s own online store. You will need a compatible smartphone to use the headset.

Shipping started on 10 November in the UK, US and other countries. Priced at £69 in the UK, $79 in the US, Daydream View is available in slate, snow and crimson colours.

The Daydream app also launched in the Google Play Store on 10 November. This app serves as the main hub for the Daydream View headset. With it, you can find and play Daydream-compatible apps installed on your phone. You will also have access to the Google Play Store to download other apps and experiences.

Before downloading the app, you may have to update your phone to the latest version of Android 7.1 Nougat.

18 May

Google I/O 2017: All the announcements that matter


Google I/O 2017 is done and dusted.

Google holds an annual developers conference in order to get its developer community up to speed with all the updates it plans to push out to its products. Doing so gives developers a chance to get all their apps, services, integrations, and tie-ins ready before the updates actually roll out. However, consumers love to tune in as well, so Google usually makes a few announcements to get everyone excited.

So, what does Google have in store for us all in 2017 and beyond? Here’s everything you need to know about Google I/O this year, including all the announcements made and how you can re-watch the main keynote in case you missed the original airing.

When is Google I/O 2017?

Google I/O 2017 is taking place from 17 to 19 May 2017 at the Shoreline Amphitheatre in Mountain View, CA. Of course, to most of us, it’s the opening keynote that is the most interesting: the company uses it to show off all the developments we’ll get on our devices later in the year. This year’s keynote happened on 17 May at 10am PT (6pm British Summer Time).

Where is the Google I/O 2017 livestream?

You can re-watch the keynote right here!

Google’s I/O webpage has also been updated with the conference’s agenda and announcement updates.

What happened at Google I/O 2017?

Here are all the announcements that matter:


Android O

  • How to get Android O on your phone right now
  • What is Android Go and why does it matter?
  • Android Go is Google’s latest attempt at optimising budget Android phones

Google released the first developer preview of Android O in March; now it has released the official preview, along with Android Go, a version of the operating system meant to power low-end devices in emerging markets. During the keynote, we got demos of picture-in-picture, Notification Dots, smart copy-and-paste, and other features designed to improve speed, security, and battery life.

Google Home

  • Google Home can now make hands-free phone calls like Amazon Echo
  • Google Home can now give you visual responses on phones and Chromecast

Google Home, which is Google’s Amazon Echo-like smart speaker loaded with Google Assistant, has been updated with a bunch of new features. It will now have “proactive assistance”, also known as push notifications, as well as hands-free calling (outgoing only, at launch), Spotify, SoundCloud, and Deezer integration, and Bluetooth support. It can even launch HBO Now.

Another new feature is Visual Responses, which sends information to a display device – directions to your phone, say, or a calendar to your TV via Chromecast. This feature also enables you to interact with streaming services without having to talk to your TV. And because Google Home recognises your voice, it will pick and display the right info for you if you have multiple users set up.


Google Assistant

  • Move over, Siri: Google Assistant is officially coming to iPhone
  • Google Lens brings super powers to your phone camera

Google’s competitor to Alexa is only a year old but is growing more useful every day. For instance, it has added support for Whirlpool and GE appliances, and Assistant is now available on iOS as a standalone app. Google also released a developer’s kit, meaning it will soon come to more devices. The kit includes Actions, too, allowing developers to deliver the ability to perform transactions from request to receipt.

Google Assistant has also been improved by Google’s machine learning progress, the company said. Now you can type into Assistant on your phone and it will communicate with you about what you show it through Google Lens’ eye (the camera), including translations.


Google Photos

  • Google Photos gets better at sharing, creates actual photo books
  • Google Lens brings super powers to your phone camera

Google has added Suggested Sharing to its Google Photos service. It nudges you to share and recommends who to share with, plus it includes a feed of shared images. There are now Shared Libraries, too, which auto-save photos from a group of people. You can then put those photos into Photo Books – soft- or hard-cover books you can order, made up entirely of photos from your collection.

A new Google Lens feature is being integrated into Photos as well. It will help identify places and serve up information about them. In the Photos app, after you’ve taken a photo, Google Lens can identify buildings and get directions and opening hours for them, or it can bring up information on a famous work of art. And if a friend screenshots the contact details of a business, you can tap the number and call it right then.


VR and AR

  • Google Daydream: What does it do and what devices support it?
  • Standalone Daydream VR is now a reality, HTC and Lenovo onboard
  • Google Daydream: Google’s Android VR platform explained

One year after introducing its Daydream View platform, Google has announced new phone partners and an unnamed Daydream-compatible headset that doesn’t need a phone. It’s a standalone wireless VR headset with built-in positional tracking. We’ll learn more about these devices later this year. Google also discussed its visual positioning service, which uses Project Tango to pinpoint your location indoors.

Is that it?

There were several other announcements, including that the ability to stream live and prerecorded YouTube 360 videos will be coming soon to the YouTube app on smart TVs. Google further said Gmail will get a smart reply feature that’s been available in Inbox. It launched a Google.ai division for learning systems, research tools, and applied AI. Then there’s Google Jobs, its platform that makes it easier to find jobs.

18 May

Google opens up its ‘instant’ apps to all developers


This time last year, Google unveiled “instant apps” — think of them as chunks of an application that can be run without downloading anything from the Play Store. For months after the announcement, Google only let certain partners build those bite-sized apps, but no longer. With the public launch of an SDK here at Google I/O, any developer can whip up an instant app of their own, and that’s good news for everyone involved.

Google Play product manager Ellie Powers said there are 50 instant apps in the wild already, and that the companies behind them have seen notable lifts in sales and engagement. That’s the business argument sorted, but instant apps are great news for average users, too.

Flagship smartphones come with more and more storage every year, but untold numbers of devices are still stuck with small storage allotments. For people with more limited devices, the ability to access an app’s most crucial functions without eating up more space than needed could be a game-changer. Beyond that, downloading and installing apps takes time. With instant apps, all users have to do is click a link; the applet quickly loads and lets them do what they came for.

In a way, instant apps blur the line between web apps and native ones; you’d ideally get the speed of the former with the functionality of the latter. Sadly, creating instant apps isn’t nearly as fast as using them, so it’ll still be a while before they become the norm.


18 May

Twitter gives you more control over how it uses your data


Ever seen Twitter ads that were a little too relevant to your personal tastes? You now have better tools for dealing with them. Twitter has expanded both the control you have over your data inside its mobile apps as well as the amount of insight into how the social network uses that data. Venture into a “personalization and data” section in the settings and you can tell Twitter to not only avoid tailoring ads, but to stop customizing content based on location. You can even tell it to stop syncing personalization between devices, and there’s a master switch if you want to turn everything off at the same time.

And if you want transparency as to how that data is used, you’ll have it in spades. Twitter’s apps now show fine-grained demographic, interest and ad targeting data, with the option to edit that info if you’re concerned.

There’s a reason for all this extra privacy control: Twitter has updated its privacy policies, and it’s not all for the better. The company has updated how it shares anonymized and device-level data, some of it through “select partnership agreements.” It’s also expanding how it uses and holds on to data from third-party websites that integrate Twitter material. It’s all in the name of further personalizing services, Twitter says. It won’t store web visit data if you live in a country that’s part of the European Union or European Free Trade Association, but this might raise eyebrows regardless of where you live. You might want to review your personalization settings to make sure you’re completely comfortable.

Source: Twitter

18 May

Facebook battles clickbait on a post-by-post basis


Facebook has been steadily refining its attempts to fight clickbait articles over the years, and now it’s getting very, very specific. It’s updating its News Feed processing methods to account for clickbait on a post-by-post level, not just domains and Facebook pages. This should “more precisely” downplay the number of misleading stories cluttering your timeline, the social network says. Moreover, it’s promising a more exacting approach when it looks at individual headlines.

Until now, Facebook examined clickbait titles in a holistic way: it looked for both the exaggerated language (“you have to see this!”) and deliberate attempts to withhold info (“eat this every day”). Now, it’s considering those factors separately. The split promises a more effective approach to culling clickbait — in theory, shady writers are more likely to face punishment if they commit just one of the offenses.
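The two-signal split described above can be illustrated with a toy scorer: one check for exaggerated language, a separate check for withheld information, with either alone enough to downrank a post. The phrase lists and logic here are invented for illustration; Facebook's actual model is not public.

```python
# Toy illustration of scoring the two clickbait signals separately.
# Phrase lists are made up; a real system would use a trained classifier.

EXAGGERATION = ("you have to see", "will blow your mind", "unbelievable")
WITHHOLDING = ("this one trick", "what happened next", "eat this every day")

def clickbait_signals(headline):
    """Return (exaggerates, withholds) as independent boolean signals."""
    h = headline.lower()
    exaggerates = any(phrase in h for phrase in EXAGGERATION)
    withholds = any(phrase in h for phrase in WITHHOLDING)
    return exaggerates, withholds

def should_downrank(headline):
    """Under the split approach, either signal alone triggers a downrank."""
    return any(clickbait_signals(headline))

print(should_downrank("You have to see this dog"))    # prints: True
print(should_downrank("Scientists publish new study"))  # prints: False
```

A holistic scorer that required both signals at once would let a headline committing only one offense slip through, which is the gap the split closes.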

Facebook doesn’t believe that most pages will see “significant changes” to the availability of their posts. If all goes well, this should only affect those publishers that count on clickbait to pump up their views. It probably won’t make a night-and-day difference in your News Feed, but don’t be surprised if you see more substantive links going forward.

Source: Facebook Newsroom

18 May

Nintendo’s ‘Arms’ has all the depth the ‘Wii Sports’ games lacked


For many, the Switch represents Nintendo’s return to form. It’s the console that sheds both the name and the gimmicky motion controls that defined the Wii era of gaming. With traditional games like The Legend of Zelda: Breath of the Wild and Mario Kart 8 Deluxe leading the way, motion controls seem to be all but a thing of the past. Or they did, until Nintendo announced Arms — a gesture-based boxing game for the Switch that seems to lean heavily on Wii Sports’ legacy. It seems like a bizarre step backward, but don’t worry: It turns out that Arms isn’t repeating the mistakes of the Wii; it’s showing how Nintendo has learned from them.

In fact, distancing Arms from Wii Sports Boxing seemed to be the entire point of the game’s spring preview event. The company is proud of the game’s motion controls but stresses that it’s different from the simple waggle mechanic that defined the Wii’s minigame showcase. In Arms, the boxers’ telescopic limbs can be guided throughout the entire motion of a punch, allowing players to steer an attack as they extend toward the enemy. This means that to be effective, players need to follow through with their attacks, like in real life. It also allows the punches you throw to change direction halfway through their animation — creating a more nuanced experience than Wii Sports boxing ever could have offered.

Even so, Nintendo is finding the association with the Wii’s motion controls hard to shake. “I see this a lot,” Nintendo Product Marketing Manager JC Rodrigo told Engadget, mimicking the short wrist-flick motions that defined Wii Sports Boxing. “When it comes to flailing and kind of doing this, your arms take time to leave your body and travel across. If your opponent’s moving, and you just flick and don’t do anything, your hand will just go wherever.” It’s most players’ first guess at how to play the game, but Rodrigo says it’s a tactic that doesn’t work. “You will lose. Fast.”

There’s a lot more to the game than punching too. Players will need to learn specific gestures to block attacks and execute grabs and will have to master the art of controlling their character’s movement by leaning the Joy-Con controllers in just the right direction. It sounds complicated, but after weeks of testing each control scheme, Rodrigo said he’s found it to be the most versatile way to play. Even so, it’s not the only way to play. “There are button controls,” he said. “So there are controls for all different types.”

The core of the game is based around one-on-one arm-extending combat, but today Nintendo announced a few additional game modes. All of them play off of the game’s boxing mechanics in a slightly different way. Skillshot challenges players to knock down more targets than their opponent in a carnival-style shooting gallery, and the game’s Hoops mode only lets players score points by executing a successful grab move on their opponent to slam dunk them into a basketball hoop. There’s also a new four-player battle mode called Team Fight that pits two teams against each other while simultaneously handicapping both by tying each player to her teammate. Basically, if your partner gets thrown by an opponent, you do too — forcing players to work together to succeed.

As our time with Arms wound down, I left feeling a little nostalgic — not for the Wii Sports Boxing experience Nintendo is worried Arms will be mistaken for but for the original Super Smash Bros. on Nintendo 64. Like Smash, Arms is a lighthearted, competition-focused brawler with endearing characters, unique stages and excellently balanced gameplay. The game is easy to pick up, and it’s the perfect local multiplayer experience to share with friends — but it also seems difficult to master, which may lend it enough complexity to keep competitive gamers engaged. It’s this depth that sets it apart from Wii Sports Boxing, and what will make Arms a worthy addition to the Switch library for anyone who can see past the superficial similarities of its gesture-based control scheme.

Arms will likely wind up being a game very much like Splatoon — a fun, imaginative idea for competitive multiplayer that doesn’t completely make sense until you try it for yourself. Fortunately, Nintendo seems to understand this: Nintendo Switch owners will be able to try Arms for free during Global Test Punch events, during the last weekend of May and the first weekend of June. If the free taste gets you hooked, you won’t have to wait long for the full game either: Arms launches for Nintendo Switch on June 16th.

18 May

First ‘Star Trek: Discovery’ trailer points to a fall release


It’s been a long wait since the first teaser trailer, but we finally have our first good look at CBS’ streaming-exclusive Star Trek: Discovery series. The first trailer promises a “fall” release window, and the accompanying press release reveals its run has been extended from 13 episodes to 15, and that it will include a “Talking Trek” aftershow. We’ve heard plenty about its cast (including the likes of Jason Isaacs, Michelle Yeoh and Sonequa Martin-Green), and production delays, but we’re still waiting for the show to appear so fans can finally decide if it’s worth signing up for CBS All Access.

Source: Star Trek (YouTube), CBS Press Express

18 May

Small island prison first to install anti-drone ‘forcefield’


Prisons have a drone problem: drones are being used to fly drugs and other contraband over walls and into the hands of inmates. Dealing with these airborne mules is tricky because you either need to hope they crash or catch their operators in the act, but one prison is taking a more proactive approach to stopping undesirable deliveries. Alongside other security upgrades, the small, 139-capacity Les Nicolles Prison in Guernsey, Channel Islands, is said to be the first in the world to receive an anti-drone fence. It’s not a physical barrier, but an invisible wall that jams pilot signals and stops drones from passing beyond its threshold.

Two British companies, Drone Defence and Eclipse Digital Solutions, adapted existing jamming technology to create the “Sky Fence.” A network of roughly 20 “disruptors” is being installed in and around the prison’s perimeter to create a 600-meter (nearly 2000-foot) high, virtual wall through which drones shall not pass. Well, that’s the idea when Sky Fence goes into operation next month, anyway.

As Drone Defence CEO Richard Gill explains, the invisible wall was developed as a preventative measure and isn’t designed to knock airborne drones out of the sky. “It will look like it is bouncing off a forcefield. The operator’s video screen will go black and they will lose control. Drones made in the last few years are all designed to return to the last point at which they were under control if the signal is lost. It won’t bring the drone down because if it did and it hit someone or caused damage that would create issues of liability.”
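The return-to-home behaviour Gill describes can be sketched as a simple rule: fly the planned route until the link is lost, then fall back to the last position where the drone still had signal. This is a purely illustrative sketch, not based on any vendor's firmware; positions are simplified to one-dimensional distances from the operator.

```python
# Sketch of fly-until-jammed, return-to-last-linked-point behaviour.
# Positions are distances (in metres) from the operator; the fence is
# modelled as everything at or beyond fence_start being jammed.

def fly(waypoints, fence_start):
    """Follow waypoints until the jamming fence is reached, then return
    the last position at which the drone still had a control link."""
    last_linked = None
    for pos in waypoints:
        if pos >= fence_start:   # inside the jammed zone: link lost
            return last_linked   # drone heads back to this point
        last_linked = pos
    return last_linked           # route completed without losing link

# A drone approaching a fence that starts 600 m out turns back at 500 m.
print(fly([100, 300, 500, 700], fence_start=600))  # prints: 500
```

This is also why the fence needs no physical stopping power: the drone's own firmware does the work once the signal disappears.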

Via: Gizmodo

18 May

MIXHalo uses your headphones to fix terrible concert sound


If you’ve ever been to a concert where everything sounded awful — perhaps because of the speakers, or the room acoustics — you’ll understand the pitch behind MIXHalo. Developed by Incubus guitarist Mike Einziger, it’s a way for concertgoers to hear the audio coming from the stage mixer through their own headphones. The company developed a custom WiFi technology that pipes the audio to potentially thousands of nearby phones with very low latency.

It might sound a bit pie in the sky, but MIXHalo proved that its technology works during a live demonstration at TechCrunch Disrupt today, which involved Incubus playing alongside one of its investors, Pharrell Williams.


And, before you roll your eyes at the idea of an audience jamming to their own headphones, consider the fact that many musicians are already doing this with in-ear monitors. “We have a much clearer sound on stage than what’s in the audience,” Einziger said.

Tuning into the MIXHalo session was pretty simple: I downloaded the app, joined its private wireless network, and then pressed play within the app. It only works with wired headphones for now (sorry, iPhone 7 owners), because the company was focused on delivering a low-latency experience. That makes plenty of sense: if you’re at a concert, even slight delays between what’s happening on stage and what’s coming through your headphones could be jarring.
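There's a physical reason the latency target is achievable: sound from the stage already takes time to reach a listener through the air (roughly 343 m/s), so the wireless stream only has to be comparable to that acoustic delay to feel live. A back-of-the-envelope check, with illustrative figures rather than MIXHalo's actual specs:

```python
# Compare the acoustic delay from the stage with a plausible wireless
# stream delay (audio buffer + network). All figures are illustrative.

SPEED_OF_SOUND = 343.0  # metres per second in air

def acoustic_delay_ms(distance_m):
    """Time for PA sound to travel distance_m through the air."""
    return distance_m / SPEED_OF_SOUND * 1000

def stream_delay_ms(buffer_samples, sample_rate, network_ms):
    """Buffering delay of the audio stream plus network transit time."""
    return buffer_samples / sample_rate * 1000 + network_ms

# A listener 30 m from the stage hears the PA about 87 ms late.
air = acoustic_delay_ms(30)
# A 256-sample buffer at 48 kHz plus 20 ms of network delay is ~25 ms.
stream = stream_delay_ms(256, 48_000, 20)
print(round(air), round(stream))  # prints: 87 25
```

In other words, a well-engineered stream can arrive before the sound from the speakers does, which is consistent with not noticing any delay against the live drums.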

As someone who’s particularly picky about sound quality, I was surprised at just how good the MIXHalo stream sounded. I could clearly hear all the lyrics during Incubus’ Drive and make out individual instruments, something that’s rarely possible at concerts. I didn’t lose out on the concert experience much, either, since I could still hear and feel the playback from the live instruments. I paid particular attention to the drummer during the demo, and surprisingly enough, I didn’t notice any delays between beats.

When I cranked my RBH EP-3 earbuds to max volume, I also didn’t hear any distortion or compression artifacts. It’s admittedly tough to judge audio quality accurately amid the noise of a live event, though. There was the occasional hiccup during the stream, but it corrected itself quickly. Incubus rounded out the event by backing Pharrell’s Get Lucky, a musical collaboration I never expected to sit through.

According to Engadget’s Roberto Baldwin, a music nerd who often spends weekends rocking out on stage, MIXHalo could be particularly useful for anyone concerned with hearing loss. Concerts can get incredibly loud, after all, so many regulars end up wearing earplugs to keep things at a manageable level. It could also be great for anyone hard of hearing, he notes.

MIXHalo is still developing its technology, but it hopes to head to concert venues by this fall.