Ford is using Microsoft’s HoloLens to change the way cars are designed
Why it matters to you
If carmakers can reduce the time and cost of designing cars, the savings could be passed on to the consumers.
Microsoft’s HoloLens celebrated its first birthday earlier this year, and the company believes it can change all sorts of everyday activities through the power of augmented reality. That includes potential applications in the car world.
Virtual car design has come one step closer to reality, though some industry traditions are sticking around: clay models are still being used alongside digital ones. Microsoft calls this “mixed reality” and defines it as “the result of blending the physical world with the digital world.”
Lorraine Bardeen, General Manager, Microsoft HoloLens and Windows Experiences, shared news of the company’s latest collaboration with automaker Ford, which is “embracing the digital transformation of the modern workplace across the company to make people’s lives better in vehicles today while exploring evolving mobility solutions such as autonomous vehicles of tomorrow.”
One cited example is grille design — designers were able to speed the process of development from days with multiple physical models to just hours with a HoloLens and just one physical model. Designers can use the system to view their work in three dimensions without engaging in the time-consuming process of making a physical model.
In the car industry, full-size clay mockups of new designs are typically built and shown to executives for approval, but doing it all digitally could allow designers to incorporate any changes more easily. It would also save a lot of clay.
Ford already uses full-on virtual reality headsets to allow groups of designers to go over new cars — even when they’re not on the same continent. Teams can share their findings and notes in confidence, with a lower risk of prototype designs leaking.
The advantage of HoloLens is that it allows users to view virtual projections overlaid onto the actual environment. That makes it potentially easier to use in the real world.
Don’t be surprised to see more automotive applications if the headset ever goes mainstream, if for no other reason than that automakers can feel good about being swept up in the technological zeitgeist.
Update: Included information on Microsoft’s collaboration with Ford to bring mixed reality to car design.
How the CDC uses Google, AI, and even Twitter to forecast flu outbreaks
As summer gives way to fall, flu season is about to be upon us. Proper preparation is essential if there are to be enough medical professionals and vaccines to go around. The Centers for Disease Control and Prevention play a huge role in making sure practices and hospitals around the country know what to expect.
The CDC needs all the information that it can get to do this important work. Now, machine learning is bringing together a staggering amount of data — comprising everything from retail sales of flu medication to Google searches about symptoms — to create the best possible picture of the spread of the virus, as it happens. If it works, it could make predicting the spread of disease as commonplace as forecasting tomorrow’s thunderstorms.
Forecast face-off
Over the last four years, the CDC has run a forecasting research initiative intended to build better methods of predicting what flu season will bring.
Participants are invited to submit their own forecasting systems, which are judged stringently based on their accuracy. Each system needs to forecast when the season will start, when it’s going to peak, how bad it will be at its peak, and how bad it will be one, two, three, and four weeks out.
After that, participants are asked to submit a new forecast for each of these seven criteria every week through flu season, using new data that has been collected. Forecasts need to be made for each of ten regions comprising the U.S.
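To make the shape of one of those weekly submissions a bit more concrete, here is a minimal sketch in Python. It’s our illustration rather than the CDC’s actual template, the field names and numbers are hypothetical, and real entries are probabilistic rather than single guesses.

```python
from dataclasses import dataclass

# Hypothetical illustration only: the CDC challenge uses its own submission
# template, and real entries assign probabilities to ranges of possible
# values rather than the single point estimates used here for brevity.
@dataclass
class WeeklyForecast:
    region: str              # one of the ten U.S. regions
    season_onset_week: int   # predicted week the flu season starts
    peak_week: int           # predicted week of peak activity
    peak_intensity: float    # predicted severity at that peak
    week1_ahead: float       # predicted activity one week from now
    week2_ahead: float       # ...two weeks from now
    week3_ahead: float       # ...three weeks from now
    week4_ahead: float       # ...four weeks from now

example = WeeklyForecast(
    region="Region 1",
    season_onset_week=47, peak_week=5, peak_intensity=5.2,
    week1_ahead=2.1, week2_ahead=2.6, week3_ahead=3.0, week4_ahead=3.3,
)
print(example)
```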
Once the flu season comes to an end, the forecasts are compared with the actual data that was collected. A total of 28 different systems were submitted to the CDC this year. Two of them were developed by Carnegie Mellon University’s Delphi research group, led by Roni Rosenfeld — and those two projects took both the number one and number two spots in the final ranking.
The CDC currently tracks the flu using a surveillance system. The key difference is that surveillance only looks at what’s happening right now, while forecasting can make a probabilistic statement about what’s going to happen in the future. The work being done by the Delphi group, among others, is poised to make a huge impact on the organization’s ability to plan for flu season – and the scope of this research goes well beyond the flu.
Sources of infection
There are two main strands to the work the Delphi group is doing in conjunction with the CDC. The first is an improvement to the organization’s current surveillance techniques, which Rosenfeld refers to as ‘nowcasting.’ The aim is to make this data available in as close to real time as possible, without sacrificing any accuracy.
“It takes a while to collate all these numbers, compile them, check them, and publish them,” Rosenfeld explained in a phone call with Digital Trends. “So as a result, when the CDC publishes their surveillance numbers online, they actually refer to the previous week, not the week that we’re in. So, they’re already between one and two weeks old.”
The researchers are supplementing the data that the CDC collects with various other sources. They’re taking information from Google Trends, statistics regarding how many people access the organization’s online resources pertaining to the flu, and Wikipedia access logs. They’re even starting to take tweets about the flu into account, as well as retail sales of flu medication.
However, some of these sources don’t always measure how many people are getting the flu. They might instead indicate the level of flu awareness.
“If there’s unusual news coverage of flu — maybe because a celebrity got the flu, or something — you would expect to see that influencing how many people search for flu on Wikipedia, or on Google,” said Rosenfeld. “But it would not influence how many people are hospitalized for flu.” The system is being refined so that fake peaks, like the surge of web searches described above, aren’t considered.
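As a rough illustration of how several noisy web signals can be fused into a single nowcast of this kind, here is a minimal sketch using ordinary least squares. It is a simplification of the general idea, not the Delphi group’s actual model, and every signal name and number in it is invented.

```python
import numpy as np

# All numbers are made up. Each row holds one past week's web signals
# (say, search volume, Wikipedia page views, and flu-related tweet counts)
# for weeks where the CDC's official surveillance value is already known.
signals = np.array([
    [1.2, 0.8, 0.5],
    [1.5, 1.1, 0.7],
    [2.0, 1.6, 1.1],
    [2.6, 2.0, 1.4],
])
official = np.array([1.3, 1.7, 2.2, 2.9])  # the CDC's published values

# Fit weights that map the noisy web signals onto official surveillance.
# In practice, signals inflated by news coverage (the "fake peaks" above)
# have to be detected and down-weighted; that step is omitted here.
X = np.column_stack([signals, np.ones(len(signals))])  # add an intercept
weights, *_ = np.linalg.lstsq(X, official, rcond=None)

# Apply the weights to this week's signals to estimate the number the CDC
# won't publish for another week or two.
this_week = np.array([3.1, 2.4, 1.8, 1.0])  # trailing 1.0 is the intercept
nowcast = this_week @ weights
print(f"Estimated current flu activity: {nowcast:.2f}")
```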
In terms of forecasting, the team is using a combination of three methods that have been developed over the past few years, bringing together models of flu dynamics with time series analysis methodology that’s commonly used by economists.
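For readers unfamiliar with that economists’ toolkit, the sketch below shows its simplest form: an autoregressive model that predicts next week’s flu activity from the previous two weeks. The real systems are far more sophisticated and, as noted, also fold in mechanistic models of how flu spreads; the numbers here are invented.

```python
import numpy as np

# Made-up weekly flu-activity values for recent weeks.
history = np.array([1.1, 1.3, 1.8, 2.4, 3.1, 3.7])

# Fit a tiny AR(2) model: next_week ~ a*this_week + b*last_week + c.
X = np.column_stack([history[1:-1], history[:-2], np.ones(len(history) - 2)])
y = history[2:]
(a, b, c), *_ = np.linalg.lstsq(X, y, rcond=None)

# Forecast one week ahead from the two most recent observations.
forecast = a * history[-1] + b * history[-2] + c
print(f"One-week-ahead forecast: {forecast:.2f}")
```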
The results speak for themselves. Information released by the CDC gave Delphi’s Epicast system a “skill score” of 0.451, and its Stat project scored 0.438 — where perfect predictions would have earned 1.00. For comparison, assumptions of what was going to happen based on a simple average of previous data would have only scored 0.237.
That score might not seem like much compared to an ideal of 1.00, but it’s easier to see the strength of the Delphi team’s work when it’s compared to that of other groups taking part in the initiative. Typically, when different systems are averaged together, they cover for one another’s weaknesses and score better. However, even when all 28 submissions were combined to create an ensemble forecast, the system could only score 0.430 – a hair below Delphi Stat on its own, and well below Delphi Epicast.
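To give a sense of what a number like 0.451 means, here is a hedged sketch of how a skill score of this general kind can be computed: each week a system assigns probabilities to the possible outcomes, the probability it placed on what actually happened is recorded, and the skill is the geometric mean of those values, so a perfect forecaster scores 1.00. This is our illustration of the concept rather than the CDC’s exact scoring code, and all the numbers are made up.

```python
import numpy as np

# For each of a few (made-up) weekly targets, the probability each system
# assigned to the outcome that actually happened.
prob_on_truth = {
    "strong_model": [0.55, 0.40, 0.48, 0.37],
    "second_model": [0.50, 0.42, 0.44, 0.39],
    "baseline":     [0.25, 0.20, 0.30, 0.22],
}

def skill(probs):
    # Geometric mean of the probabilities placed on the observed outcomes;
    # a forecaster that always put 100 percent on the truth would score 1.00.
    return float(np.exp(np.mean(np.log(probs))))

for name, probs in prob_on_truth.items():
    print(f"{name}: skill = {skill(probs):.3f}")

# A simple ensemble: average the systems' probabilities week by week,
# then score the blended forecast the same way.
ensemble = np.mean(list(prob_on_truth.values()), axis=0)
print(f"ensemble: skill = {skill(ensemble):.3f}")
```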
Trickle Down
For the purposes of the CDC’s initiative, the Delphi group is working with the organization’s needs in mind. The CDC’s primary interest in a forecasting platform is improving its ability to time its response to flu season.
The CDC needs to make its public announcements about the flu season and begin its vaccination campaigns at just the right time. If those efforts come too early or too late, they won’t be as effective as they could be.
For now, the CDC is the “main driver” behind the project, according to Rosenfeld. Going forward, he says that he can see the platform being used at state and county levels. Hospitals could even use its forecasting capabilities to help determine what their staffing and equipment needs might be.
Rosenfeld is excited about the prospect of individuals being able to use the forecasts to inform their own behavior. “If you have a mother or a mother-in-law who is 90 years old and wants to go visit their sister in Cleveland, if you know that flu is going to peak in Cleveland two weeks from now, it would be useful to be able to advise her not to go,” he explained. “Because flu can be very deadly for older people.”
It’s important to note that the forecasting isn’t exact — you’re not going to be told, definitively, whether you will or will not contract the flu virus by setting foot in Cleveland. Rosenfeld compares it to a weather station’s precipitation reports, in that it offers a general idea of where it will rain, and how much, over the coming days and weeks.
The Delphi group is working on influenza forecasting because the need is imminent, and data is plentiful, but its platform is capable of much more. The team is already using its technology to look at dengue fever, which kills thousands of people every year, and there are plans to apply the same tools to diseases and conditions including HIV, Ebola, and Zika.
This is a field known as epidemiological forecasting — and it’s blossoming.
Under the Weather
To put the current state of epidemiological forecasting into context, Rosenfeld compares it to weather forecasting, which entered its infancy in the U.S. in the 1860s.
“At the time that it started, people didn’t realize how useful it would be economically and socially, and how much it could progress,” he said of weather forecasting’s early years. “It took many, many years — many, many decades — of development across multiple dimensions.”
Meteorologists had to put infrastructure in place to collect measurements and readings, first around the country, and then around the world. They had to develop new statistical models, and do other mathematical work to put this data to use. New technology was needed to analyze their findings. Weather forecasting was among the first applications for early supercomputers.
“If you compare that to epidemiological forecasting, we’re at the very beginning,” Rosenfeld said. “We do have the computing power, we have a head start in that regard. But we need to develop the theory, and we need to develop the measurements.”
Rosenfeld hopes that the research that’s being done as part of this CDC initiative will demonstrate the broader potential for epidemiological forecasting. “It will take quite a few years to grow, and a significant investment,” he acknowledged. “We’re trying to make the case for it. We’re trying to start the work and show the vital benefits of forecasting.”
Rosenfeld and his team have no small task ahead of them. Just as the benefits of weather forecasting weren’t immediately obvious, the value of epidemiological forecasting is hard to demonstrate before the necessary infrastructure and theoretical frameworks exist, and those are difficult to build without the proper backing.
Working with the CDC has helped the Delphi group make some major advances on influenza forecasting. The next step is to look at more infectious diseases, and continue to improve upon the forecasting being done. With any luck, the results will help medical practitioners see the thunderhead of an outbreak before it occurs.
Maze-like chip helps spot aggressive cancer cells
It’s difficult to spot cancer cells — just one in a billion blood cells is cancerous. How do you isolate them to know the trouble someone is facing and eventually treat it? By drawing the kind of mazes you enjoyed as a kid, apparently. Researchers have developed a microfluidic chip that uses a circular labyrinth to separate cancer cells from the rest of your bloodstream and spot the stem-like cells that will aggressively spread that cancer. Ultimately, it’s a creative use of physics. The curves tend to push larger cancer cells forward (smaller regular cells cling to the walls), while the corners mix things up and put white blood cells in an ideal position.
And importantly, it’s much faster than conventional methods, which use markers and traps to gradually bind cancer cells. Blood flows quickly, so you’re only waiting minutes to pinpoint the cancer. And if you need better results than you got the first time around, you just have to add another chip.
The technique could be the key to a new wave of cancer treatments. If you can single out aggressive cancer cells, you’ll have a better sense of how to treat the cancer in question. In an ongoing breast cancer trial, for example, it’ll show whether or not blocking an immune signalling molecule might reduce the number of stem-like cancer cells. This won’t necessarily lead to cures for stubborn cancers, but it could offer hope in situations where a cancer would otherwise be impossible to stop.
Source: University of Michigan, Cell
Office 365’s revamped web launchers put you to work sooner
Sometimes, it’s not your productivity apps that need a tune-up… it’s how you get to those apps that needs work. And Microsoft knows it. The tech firm has redesigned the Office.com front end and Office 365’s web app launcher. There’s now a recommended section that surfaces the activity that’s relevant to you, so you can quickly jump to where you’re needed — say, a Word file that needs edits or your recent PowerPoint presentation. You can also search for people, apps, documents and sites right from the get-go.
Things promise to be simpler once you’re waist-deep in work, too. There’s a streamlined Office 365 web launcher that focuses just on the most common apps and those you use often, so you’re not wading through menus to return to a favorite tool. You can pin apps you know you’ll need to use, and there’s a prominent recent documents section that will switch you back to that all-important report. If you’re not sure which app to use, you can explore recommended apps to get resources, install desktop apps, and otherwise determine whether or not it’s the right tool for the job.
Microsoft expects these updates to reach Office 365 users “soon,” so don’t be surprised if they aren’t visible right away. Whenever you get them, it’s clear that Microsoft is increasingly treating Office’s web version as its own sort of operating system, not just as a collection of apps. This won’t necessarily lure you or your company away from Google Apps, but it could make the Office experience more cohesive.
Source: Office Blogs
This D&D 4K touchscreen custom table is a Dungeon Master’s dream
Why it matters to you
Old-school gamers are embracing the latest technology for a fun new retro gaming experience.
Dungeons & Dragons has grown in popularity over the last several years, with new digital toolkits that make a Dungeon Master’s job easier to handle. No dice to lose, no rulebooks to thumb through, no smudged graph paper to track your progress. But veteran D&D fan Ken Hinxman has taken dungeon mastering to the next level with a home-built, high-tech Dungeons & Dragons custom table that includes a 4K display and touchscreen capability.
Hinxman, who’s known as Caethial on Tumblr and Twitter, built the table in a single day on New Year’s Eve 2016 with two of his friends. Hinxman modestly notes that “there really are no plans for the table, we had a general idea of how we wanted it to turn out, but we made most of the decisions on it on the day we built it.”
The table itself was built with lumber you can buy at any hardware store, along with some brackets and a few screws. According to Hinxman, the table itself cost about $120 to make. It’s furnished with a 40-inch Samsung smart TV and a Dell Precision 5720 4K touchscreen workstation. The touchscreen is a nice touch, but the setup also works well with a laptop keyboard and mouse, as you can see in one of the gallery images above. Building one of these babies in your own gaming den will run you about $2,500.
Caethial’s table set off a firestorm on Reddit when it was first posted, with more than 75,000 upvotes (and counting) at present. We’ve featured custom gaming tables before, but this was a completely homebrew venture and the finished product is remarkable to behold.
The software Hinxman uses to run his games is Fantasy Grounds, a virtual RPG tabletop system that’s officially licensed by Wizards of the Coast and available on Steam. Hinxman says that he prefers Fantasy Grounds over other systems such as Roll20, noting that he finds it far easier to use. Although generally used for online play, he says it works quite well for home games and allows a “fog of war” effect where the map is gradually revealed to the players as they explore it.
Check out the photo gallery for some great images from Hinxman’s Tumblr page to see more detail of the build itself, as well as exciting action shots of a game in full swing.
Intel cancels Project Alloy wireless VR headset, but is still investing in AR/VR
Why it matters to you
It’s inevitable that we will cut the cord for augmented and virtual reality, and this project helped us to see the possibility.
During last year’s Intel Developer Forum opening keynote, CEO Brian Krzanich revealed the company’s virtual reality project, aka “Project Alloy.” The headset was completely untethered, meaning users could move about freely without cords, and without an additional PC and battery strapped to their back.
And now it’s dead. Road to VR reports that Intel has cancelled plans for the device, which was supposed to launch in the fourth quarter of this year.
“Intel has made the decision to wind down its Project Alloy reference design, however we will continue to invest in the development of technologies to power next-generation AR/VR experiences,” Intel said in a statement to Road to VR. “This includes: Movidius for visual processing, Intel RealSense depth sensing and six degrees of freedom (6DoF) solutions, and other enabling technologies including Intel WiGig, Thunderbolt, and Intel Optane.”
The tech conglomerate went on to say that Project Alloy turned out to be a great proof of concept, and showed what could be done with the technology.
We tried the first-generation version of the headset at CES 2017 and found it to be a bit rough. We were expecting this with such an early model, but the problems hurt the usability of the product. The tracking was the main issue: the headset scanned the room to build a map of nearby objects, but that map wasn’t very accurate, and we found ourselves bumping into things.
Project Alloy featured a built-in battery on the backside of the device’s head strap. The only cord that was used during Intel’s keynote presentation was one to project the wearer’s viewpoint onto the big screen for all to see. Project Alloy did not require external sensors, nor did it require controllers for moving and interacting with the environment. Instead, the headset relied on a pair of Intel RealSense sensors.
Wearers had a full six degrees of freedom within the virtual world. The sensors could track every movement, enabling room-scale mobile VR. One sensor mounted on the front could even “scan” real objects and bring them into the virtual realm in real time, such as the wearer’s hand or another person’s head.
In a demo, the user approached a virtual X-ray machine with his real hand and saw the bones underneath. He then interacted with a virtual switch using his actual hand too. After that, the demo went on to reveal that Project Alloy supported multi-room environments, allowing the wearer to physically walk into the new virtual area.
Users aren’t limited to just their hands. The RealSense sensor could bring in any object that can be used to interact with the environment. In the demo, the wearer used a real dollar bill to shave a virtual spinning object. You could actually see the wearer’s hand holding the dollar bill in the virtual realm, not a rendered stand-in.
The unit was likely going to see a reduction in size as Intel fine-tuned the components within. The Intel-powered computer resided inside the device.
Intel was using Project Alloy to expand upon the foundation established by the Oculus Rift and HTC Vive by mixing the real world with a virtual environment. It was the exact opposite of Microsoft’s HoloLens headset, which projects holograms into the user’s view of the real world. However, Krzanich indicated that Project Alloy would be able to bring the virtual into the real world at some point in the project’s future.
Intel’s initial keynote opened with the possibility of VR becoming so lifelike that it would be hard to distinguish from the real world.
Update: Added information on Project Alloy’s cancellation and our hands-on experience.
Twitter Lite site gets the inevitable app, now being tested in the Philippines
Why it matters to you
With 45 percent of global smartphone connections on 2G networks, lightweight versions of apps are key to success. Twitter is jumping on the bandwagon with Twitter Lite.
Facebook has done it, YouTube has done it, and Twitter is doing it, too. We’re talking, of course, about offering a lightweight version of the platform to make Twitter easier to use in countries with less robust data access. In April, Twitter product manager Patrick Traughber published a blog post announcing the debut of Twitter Lite, described as “a new mobile web experience which minimizes data usage, loads quickly on slower connections, is resilient on unreliable mobile networks, and takes up less than 1MB on your device.”
Though we already have the Twitter Lite website, it was inevitable that an app would follow, and TechCrunch reports it’s being tested in the Philippines.
In the Philippines, the app can be found in the Google Play Store for devices running Android 5.0 and above. It has English and Filipino support, and can be used on 2G and 3G networks.
“The test of the Twitter Lite app in the Google Play Store in the Philippines is another opportunity to increase the availability of Twitter in this market,” a Twitter spokesperson told TechCrunch. “The Philippines market has slow mobile networks and expensive data plans, while mobile devices with limited storage are still very popular there. Twitter Lite helps to overcome these barriers to usage for Twitter in the Philippines.”
The app is under 3MB and has a “data saver mode to download only the images or videos you want to see,” according to the app’s download page.
While smartphone adoption is growing at a rapid rate around the world, infrastructure isn’t necessarily keeping up. In fact, the GSMA reports, 45 percent of mobile connections remain on 2G networks. And with smartphone connections now numbering around 3.8 billion, that’s a lot of phones on slower networks.
The Twitter Lite site not only requires less data, but promises 30 percent faster launch times and quicker navigation throughout the platform. Users will still be able to see the core components of the social media service, including the timeline, Tweets, Direct Messages, trends, profiles, media uploads, and notifications, all without an app. There’s no word on whether Twitter’s new night mode will come to the Lite platform.
And to make things more efficient still, Twitter’s data saver mode could potentially reduce your data usage by up to 70 percent. Twitter Lite also offers offline support, so you’ll be able to maintain your session even if your connection is spotty.
You can check out Twitter Lite at mobile.twitter.com on a smartphone or tablet. More information can be found at lite.twitter.com, and if you’re interested in learning how the tool was built, you can do that here.
Update: Added information about Twitter Lite app being tested in the Philippines.
Best new songs to stream: Arcade Fire, Mac DeMarco, and more
Every week, there are thousands of new songs hitting the airwaves — and it’s just too much for your two ears to handle. With all those options, you can’t be wasting your time on tracks that deserve a thumbs-down click — you want the best new songs to stream right now.
But don’t worry, we’re going to save you the hassle. We listen to some of the most-hyped and interesting songs each week, and tell you which are worthy of your precious listening time.
Here are our five best new songs to stream this week. And don’t forget to subscribe to our Spotify page for a playlist of our weekly picks, which can also be found at the bottom of this post. Not sure which streaming service is best for you? Check out our post about the best music streaming services, or go in depth and learn the differences between Apple Music and Spotify to better weigh your options.
Arcade Fire — Mind Games (Live at Spotify)
Arcade Fire’s recent appearance at Spotify’s studio produced this slick rendition of John Lennon’s Mind Games. A thoughtful take on an oft-forgotten classic, it features a clean arrangement complete with beautiful string tones, as well as frontman Win Butler channeling the late Beatle.
Mac DeMarco — Dreams From Yesterday
Mac DeMarco recently moved to Los Angeles, where he’s been hard at work writing and recording even more chilled-out singles for his rabid fanbase. This 360-degree video from California’s KCRW radio showcases one of his recently penned numbers, a soft and groovy song called Dreams From Yesterday that is sure to please your ears.
Charlotte Gainsbourg — Deadly Valentine
Musician and actress Charlotte Gainsbourg pairs up with rapper Dev Hynes (aka Blood Orange) in this video for her recent single Deadly Valentine, which features young look-alikes of the pair slowly evolving into their current selves on screen. The song itself is a forward-leaning, Daft Punk-like jam that will push you through even the toughest workout with ease.
Bibio — Phantom Brickworks III
If you’re looking for a gentle single, Bibio’s Phantom Brickworks III is the perfect option. Soft voices and reverb-laden piano dominate the track, which belongs on the speakers of every day spa and massage parlor in the nation.
Dave Depper — Do You Want Love? (And more, live on KEXP)
Dave Depper has made a name for himself as a guitar-slinging sideman for some of the biggest names in indie rock, from Death Cab for Cutie to Ray LaMontagne. Fresh on the heels of his first solo album, Depper and his excellent live band recently appeared on Seattle’s KEXP Radio, where they performed hyper-clean renditions of pop songs like Do You Want Love? and more.
That’s it for now, but tune in next week for more songs to stream, and check out the playlist loaded with our recent selections below:



