Man somehow plays sax solo following brain surgery, for science
Why it matters to you
Generating custom 3D maps of the brain allows surgeons to perform brain surgery on patients without risking damage to particularly sensitive areas.
When 25-year-old Dan Fabbio was diagnosed with a brain tumor a couple of years ago, there was an added complication: The tumor was located in the part of his brain that’s responsible for music function. Fabbio’s job? Working as a music teacher in a school in New Hartford, New York. The diagnosis kicked off a cutting-edge research project involving Fabbio and a number of physicians and surgeons — with the goal of not only removing the tumor, but doing so in a way that would not negatively impact Fabbio’s musical abilities.
This meant designing a series of fMRI experiments that could be used, in the words of the investigators, “to map music in Dan’s brain.”
“Our goal in the Translational Brain Mapping Program is to carefully map each patient’s brain who comes to URMC for surgery,” Professor Bradford Mahon of the University of Rochester Medical Center told Digital Trends. “This type of personalized brain mapping is important because, while everyone’s brain is organized in more or less the same way, there is inter-individual variability in the precise location of specific functions. Furthermore, we look at the broader life of each patient who comes through our program, with the goal of preserving the humanity of each patient. If a patient is a musician, we are going to look closely at music processing; as another example, we have carried out ‘mathematics mapping’ in a math professor and in an accountant. We have mapped the ability to move the hands and use tools in a craftsman. Our goal is to provide the very best neurosurgical care to each patient while ensuring that when the patient leaves the hospital, they can go back to work, go back to their family, and go back to the things that they are passionate about.”
In the case of Fabbio, the information the team gathered was used to create a detailed 3D map of his brain — including notes on both the exact tumor location and music function. This was then used to guide the surgeons while they were operating, during which Fabbio was awake and repeating humming and language exercises he had learned before the surgery. This allowed the surgeons to know whether they were potentially disrupting a part of the brain associated with music processing.
After the operation was finished, Fabbio demonstrated its success by playing his saxophone in the operating room — causing the room to erupt in applause. He is now recovered and back to teaching music.
A paper describing the work was recently published in the journal Current Biology.
SoundHound redesigns its music app to be a more personal experience
Why it matters to you
SoundHound’s latest update to its music discovery app keeps track of your discovered songs, allows you to link other streaming services, and curates song charts to expose you to even more tunes.
On Wednesday, SoundHound Inc. introduced a complete redesign to its music discovery app with the launch of SoundHound 8. The new features — such as advanced music search, discovery, and connected streaming — provide a more tailored experience based on the user’s history.
Reminiscent of Shazam, SoundHound is specifically designed to help you easily discover music playing around you. By tapping one button and letting your phone listen for a few seconds, the app can identify exactly what song is playing. It will even recognize songs if you hum or sing into the app.
Aside from giving the app a more streamlined look, version 8.0 now keeps every song you discover in your history. You can refer back to see where you heard a particular song, as well as mark your favorites. Any song can be played back for free through the integrated YouTube player.
Apple Music or Spotify users also have the option to connect their accounts to SoundHound 8. You are able to build playlists using your discoveries and unlock real-time lyrics to display while the song plays.
When you are searching for something to listen to, the app also curates top songs across a variety of genres. And since it comes equipped with the “OK Hound …” wake phrase, users can go hands-free and request songs, genres, or artists using voice commands.
SoundHound publicly introduced its search and assistant app — Hound — in 2016. It is similar to other digital assistants like Siri, Cortana, and Google Now, except that it claims to be faster and more accurate. But what differentiates it from the rest is that it pushes toward a more conversational experience that feels natural when communicating with our smartphones.
If you run a search on a celebrity, the app will try to link your voice search to a music-related song or individual. But if you want to know more about a specific celebrity, SoundHound can provide their age, top songs, or any other information you would like to know.
With more than 300 million global users, SoundHound is clearly popular among the music-loving community. SoundHound 8 is available for download on both Android and iOS.
Scientists detect strange repeating radio burst on the other side of the cosmos
Why it matters to you
This repeating radio burst observation may lead to a better understanding of this baffling cosmic phenomenon.
It seems like every time we attempt to take a step toward better understanding our cosmos, we are left with more questions than answers — a regular Bonini’s Paradox. Just a few years ago, we didn’t even know that the cosmic phenomena known as fast radio bursts (FRBs) — rare, bright, and inexplicable signals from beyond our galaxy — existed. And until recently, only one of these FRBs had ever been recorded on more than one occasion. Last week, however, a team recorded that lone repeater bursting yet again.
The scientific community has been perplexed by these enigmatic signals for the past 10 years. Currently, the proposed explanations for FRBs range from outbursts of neutron stars to some sort of propulsion system used by an alien civilization on the opposite side of the universe. Some have even suggested these signals are the result of dark matter — another space thing we know very little about — smacking into black holes.
In 2015, researchers once again “heard” an FRB known as FRB 121102 that was first observed in 2012. Just last week, UC Berkeley postdoctoral researcher Vishal Gajjar found FRB 121102 blipping yet again. Gajjar used the SETI Breakthrough Listen program at the Green Bank Telescope in West Virginia to make the discovery. Over the course of five hours of observation across the entire 4 to 8 GHz frequency band, Gajjar and the team uncovered 15 “new” pulses coming from FRB 121102. But what does it mean?
“The possible implications are two folds,” Gajjar explained to CNET. “This detection at such a high frequency helps us scrutinize many (of FRB 121102’s) origin models. The frequency structure we see across our total band of 4 to 8 GHz also allows us to understand the intervening medium between us and the source.”
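The “intervening medium” Gajjar mentions delays lower radio frequencies more than higher ones, and the size of that delay is summarized by a source’s dispersion measure (DM). As a back-of-the-envelope sketch (not part of the team’s analysis), the standard cold-plasma dispersion formula shows how large that smearing is across the 4 to 8 GHz band; FRB 121102’s DM of roughly 560 pc cm⁻³ is an approximate published value used here purely for illustration:

```python
# Arrival-time delay of a dispersed radio burst between two frequencies.
# Uses the standard dispersion constant of ~4.149 ms for DM in pc cm^-3
# and frequencies in GHz.

def dispersion_delay_ms(dm, f_low_ghz, f_high_ghz):
    """Delay (ms) of the low-frequency arrival relative to the high one."""
    return 4.149 * dm * (f_low_ghz ** -2 - f_high_ghz ** -2)

# Across the 4-8 GHz band used in the Green Bank observation, with an
# approximate DM of 560 pc cm^-3 for FRB 121102:
delay = dispersion_delay_ms(560, 4.0, 8.0)  # roughly a tenth of a second
```

Correcting for this frequency-dependent lag is what lets observers reassemble a burst that arrives smeared across the band, and measuring the DM itself is how the intervening plasma between us and the source gets characterized.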
According to Gajjar, the repetitive nature of this particular FRB and its recent bout of hyperactivity may rule out the colliding-black-holes origin hypothesis. A more probable explanation could involve pulsars; however, exactly what is emitting these powerful signals is still largely a mystery.
“As the source is going into another active state means that the origin models associated with some sort of cataclysmic events are less likely to be the case of FRB 121102,” explained Gajjar. “It should be noted that they can still be valid for other FRBs.”
The SETI team has asked other researchers to take advantage of FRB 121102’s current window of heightened activity. The latest findings, along with the more than 400 terabytes of data from the recent observation, will be detailed more thoroughly in a forthcoming report. While the jury is still out on FRBs, we’re still patiently waiting for a response to the Arecibo Message, the interstellar radio signal we transmitted into the cosmos in 1974. Perhaps when the aliens on the other side of the universe receive that, they’ll be equally perplexed and intrigued…
Robots learn to understand the context of what you say
Telling robots what to do can be frustrating, especially if you aren’t a programmer. Robots don’t really understand context — when you ask them to “pick it up,” they don’t usually know what “it” is. MIT’s CSAIL team is fixing that. They’ve developed a system called ComText that helps robots understand contextual commands. Effectively, the researchers are teaching robots a form of episodic memory, in which they remember details about objects, including their position, their type, and who owns them. If you tell a robot “the box I’m putting down is my snack,” it’ll know to grab that box if you ask it to fetch your food.
In testing with a Baxter robot, ComText understood commands 90 percent of the time. That’s not reliable enough to be used in the field, but it shows that the underlying concept is sound.
Of course, robots are still a long way from understanding all the vagaries of human language. They won’t know what you see as a snack unless you teach them first, for instance, and CSAIL wants to address this with future work. However, it’s already evident that systems like ComText will be crucial to making autonomous robots useful in the real world, where people generally don’t want (or expect) to issue explicit commands every time they need something done. You could speak to robot helpers almost as if they were human, rather than carefully choosing your words.
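The episodic-memory idea behind ComText can be illustrated with a toy sketch: record observed objects along with contextual details, then resolve a later command like “fetch my snack” against the most recent matching memory. This is only an illustration of the concept — the names, fields, and matching logic below are invented for the example and are far simpler than the actual ComText system:

```python
# Toy episodic memory: ground commands like "the box I'm putting down is
# my snack" so a later "fetch my snack" resolves to the right object.

class EpisodicMemory:
    def __init__(self):
        self.facts = []  # each fact: object id, position, owner, label

    def observe(self, obj, position, owner=None, label=None):
        """Record an observed object along with contextual details."""
        self.facts.append(
            {"object": obj, "position": position, "owner": owner, "label": label}
        )

    def resolve(self, owner=None, label=None):
        """Return the most recently observed object matching the query."""
        for fact in reversed(self.facts):  # newest memories take priority
            if owner is not None and fact["owner"] != owner:
                continue
            if label is not None and fact["label"] != label:
                continue
            return fact
        return None

memory = EpisodicMemory()
memory.observe("box_1", position=(0.2, 0.5), owner="dan", label="snack")
memory.observe("box_2", position=(0.8, 0.1), owner="lab", label="tools")

# "Fetch my snack" -> look up the object Dan labeled as his snack.
target = memory.resolve(owner="dan", label="snack")
print(target["object"], target["position"])
```

Resolving against the newest matching memory is what lets ambiguous pronouns like “it” default to the most recently mentioned object, which is the behavior the researchers describe.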
Source: MIT News, IJCAI (PDF)
Instagram says hackers swiped contact info for verified users
Instagram just suffered a potentially serious (and this time, very real) data breach. The social photo service is sending out alerts that intruders got access to the phone numbers and email addresses for a number of “high-profile” users by exploiting a bug in Instagram’s programming interface. The attackers didn’t obtain passwords, and Instagram says it has already fixed the bug, but it’s warning all verified users out of an “abundance of caution.”
We’ve asked Instagram for more details and will let you know if it can shed more light on the situation, such as when the breach happened and how many people were targeted. It’s also unclear if this is related to the recent hack that compromised Selena Gomez’s Instagram account.
The breach isn’t as severe as it could have been, but it’s definitely not what Instagram needs in the wake of the Gomez incident. The social network is growing very rapidly, but it might run into trouble if big-name users are hesitant to stick around over security fears. The apparently prompt fix suggests that Instagram is at least on top of these issues when they do come up.
Astronomers use AI to reduce analysis time from months to seconds
Gravitational lensing is when the image of a distant object in space — like a galaxy, for example — is distorted and multiplied by the gravity of a massive object, such as a galaxy cluster, lying in front of the smaller, faraway object. It’s a useful phenomenon that has helped scientists discover exoplanets, understand galaxy evolution, spot a super bright galaxy, detect black holes and prove Einstein right. But analyzing images affected by gravitational lensing takes a really long time, requiring researchers to compare real images with simulated ones. Just one lensing effect can take weeks or months to analyze.
But researchers at Stanford and the SLAC National Accelerator Laboratory have found a way to cut that time to just a fraction of a second. The research team trained a neural network with half a million simulated lensing images over the course of a day. Afterward, the networks — the team tested four different types — were able to pull information from the images with precision rivaling that of traditional methods.
“The amazing thing is that neural networks learn by themselves what features to look for,” Phil Marshall, a researcher with the project, said in a statement. “This is comparable to the way small children learn to recognize objects. You don’t tell them exactly what a dog is; you just show them pictures of dogs.” Another researcher, Yashar Hezaveh, added that in this case, “It’s as if they not only picked photos of dogs from a pile of photos, but also returned information about the dogs’ weight, height and age.”
With new telescopes being built that will surely uncover more and more examples of lensing, faster methods like this one will be needed to sift through all of the data. And importantly, the neural network analyses can be done on just a laptop or a cell phone.
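The core idea — train on a large set of simulated lensing images, then read lens parameters off a new image almost instantly — can be sketched in miniature. The team’s actual system uses deep convolutional networks; this toy version substitutes plain linear regression on tiny synthetic ring images so it stays self-contained, and every function and parameter here is invented for the illustration:

```python
# Toy version of "train on simulations, predict lens parameters fast":
# generate ring-like images with a known radius, fit a linear model,
# then recover the radius of a new image in a single matrix product.

import numpy as np

rng = np.random.default_rng(0)

def simulate_image(ring_radius, size=16):
    """Render a crude noisy ring of the given radius into a size x size image."""
    y, x = np.mgrid[0:size, 0:size]
    r = np.hypot(x - size / 2, y - size / 2)
    img = np.exp(-((r - ring_radius) ** 2) / 2.0)
    return img + 0.05 * rng.standard_normal((size, size))

# Training set: images labeled with the radius that produced them.
radii = rng.uniform(2.0, 6.0, size=2000)
X = np.stack([simulate_image(r).ravel() for r in radii])
w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], radii, rcond=None)

def predict_radius(img):
    """Estimate the ring radius of a new image in one matrix product."""
    return float(np.append(img.ravel(), 1.0) @ w)

# All the cost is in training; inference is effectively instantaneous,
# which is the speedup the researchers describe.
estimate = predict_radius(simulate_image(4.0))
```

The same asymmetry holds for the real system: a day of training on half a million simulations up front, then sub-second parameter estimates on each new lensing image afterward.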
The team’s research was recently published in Nature and a second paper is currently being considered for publication in The Astrophysical Journal Letters. You can access a version of that article on arXiv.
Source: Stanford, Nature, arXiv
Robots are now better at targeting individual neurons than people are
The brain is a delicate thing, and scientists keep looking for high-tech ways to make it easier and safer to learn more about it. In the area of brain surgery, there have been smart scalpels that know the difference between tumors and healthy tissue, sensor-embedded plastic wrap to help doctors know just where to operate and even VR headsets to help surgeons monitor patients while they’re in the OR. Studying the brain leads to even better outcomes, too, and engineers at MIT have just published a paper about using robots to target individual neurons from inside a living brain in order to record their electrical signals.
Previously, neuroscientists relied on “patch clamping” to record the activity of individual brain cells. The method requires bringing a tiny glass pipette into contact with the membrane of a neuron and then opening up a small pore in that membrane. The technique takes graduates and postdocs months to learn, and it’s commensurately harder in a living brain. The new MIT system uses a computer algorithm to analyze images from a microscope and then guide a robotic arm to the target neuron. This makes it much easier to study single neurons in live tissue and see how they interact with other cells to produce cognition, sensory perception and other important brain functions.
“Knowing how neurons communicate is fundamental to basic and clinical neuroscience,” said senior author and MIT professor Ed Boyden. “Our hope is this technology will allow you to look at what’s happening inside a cell, in terms of neural computation, or in a disease state.”
The new computer-aided technique will open up the field to more researchers in time, which could yield some interesting in-depth studies that could help us understand disorders like Alzheimer’s or schizophrenia.
Source: MIT
TCL is reportedly reviving Palm with new devices in 2018
Chinese electronics company TCL boldly claimed it would release devices under the BlackBerry name to revive the brand, and its recently released KEYone smartphone proved it right. But according to Android Planet, the company is eyeing another classic tech name to resurrect: Palm. And it could have new devices under that name by 2018.
TCL actually bought the Palm name back in 2014 — four years after HP acquired the brand, only to shutter its products a year later when they underperformed. That seemed a tragic end for Palm, which had led the late ’90s and 2000s consumer device market with its PDAs and early smartphones, like the Pilot and Pre, respectively. But it looks like TCL is going to introduce an undisclosed number of devices under the Palm name early next year.
That’s all we really know, thanks to an interview the company’s marketing manager Stefan Streit gave to Android Planet. While he wouldn’t divulge what kind of devices would be included, he did tease that smartphones could be a possibility. The only other thing he revealed was Palm’s intended place in TCL’s portfolio. Rather than try to spice up the brand for new consumers, Streit mentioned that the new Palm devices would be geared toward users familiar with the old ones that ruled the gadget world before the new millennium. Whether that impacts their design or just how they’ll be marketed is unclear.
We can assume that the new Palm devices will run Android, just like TCL’s KEYone BlackBerry phone. The operating system Palm originally developed for its devices, webOS, was included in its acquisition by HP and then sold off to LG, which continues to use it in its tablets and Smart TVs.
We’ve reached out to TCL for comment and confirmation, and will add it here when we hear back.
Via: 9to5Google
Source: Android Planet
Magic Leap’s rumored AR glasses may have been revealed in patent
Magic Leap’s much-hyped augmented reality system has been an object of skepticism ever since the company secured massive funding back in 2014. The tech world seems fairly obsessed with the possibilities, as are the company’s founders, but no one is quite sure what the ultimate product will entail. We’re a bit closer today with a newly granted patent (originally filed in 2015) for a smallish set of eyewear that could be the delivery system for Magic Leap’s AR system.

The 10-page patent document has eight views of the AR glasses, which seem to include dual cameras on each side of the eyepieces, a long wire attached to each earpiece and some peripheral vision shields on the top, bottom, and sides of the glasses. A Magic Leap spokesperson told Business Insider that the pictured images are not the actual product, however. “As you know, we file lots of patents that take a long time to get approved and so what you are looking at is not our product,” she said. Other sources inside the company told Business Insider that while the basic design is similar to what is being used in-house, the current hardware is bulkier to include a depth sensor between the lenses. One source said there is only one camera on each side of the specs, as well, and that they look like thicker-rimmed hipster glasses. We’ve reached out to Magic Leap for comment and will update this post when we hear back.

Whether these specific drawings reflect current hardware or designs from two years ago, it’s still nice to see some movement from the much-beleaguered startup. Plus, these smaller spectacles are a pretty significant improvement, size-wise, over the previous backpack-style systems we’ve seen from Magic Leap.
Via: Mashable
Source: US Patent Office, Business Insider
Instagram Bug Let Hackers Access Data From ‘High-Profile’ Instagram Accounts
Multiple high-profile Instagram accounts belonging to celebrities may have been breached due to a bug related to an Instagram API, reports TechCrunch.
The bug, which has since been fixed, allowed at least one hacker to access the email addresses and phone numbers of an unknown number of “high-profile” Instagram users. According to Instagram, the only data accessed were email addresses and phone numbers, and no passwords were exposed.
It’s not clear how many users were affected, but TechCrunch speculates it’s a small number, as the hackers were only targeting celebrities.
Instagram notified all verified users of the breach in an email, letting them know the bug has been addressed and that the company’s security team is further investigating the issue. Instagram says verified users should “be extra vigilant,” as they may receive unexpected calls, texts, or emails.
Instagram recommends all users enable two-factor authentication as a precaution against unauthorized access to their accounts, even though no passwords were stolen.