Court claims Google lost right to pull site from search results
You’d think that Google’s search results would be protected in the US by free speech rights. Google gets to say what shows up on its own site, right? However, one Florida court thinks differently. It recently determined that Google wasn’t protected by the Constitution’s First Amendment when it pulled search engine optimization firm E-ventures’ website from its index. Google supposedly crossed the line when it claimed E-ventures was violating its policies by posting “pure spam” — this wasn’t strictly true, the court argued, and the takedown was driven by “anti-competitive motives” rather than self-expression.
The court also shot down Google’s attempt to invoke the law’s Good Samaritan clause, which absolves companies of liability for pulling content in good faith. It’s not clear that Google was acting in good faith here, according to the decision; in fact, part of E-ventures’ complaint is precisely that it wasn’t.
We’ve asked Google for its take on the decision, but it’s easy to see the company challenging this outcome. There’s already precedent for Google having a First Amendment right to order its search results however it likes, for one thing. And as TechDirt notes, there’s a real risk of this giving companies an escape clause whenever free speech issues come up. Don’t like that a search engine took down results for your content? Say it was an anti-competitive move. That doesn’t mean Google can never run afoul of the law with its search result strategy (the EU certainly seems to think it has), but the Mountain View crew may still have a good case.
Via: TechDirt
Source: Google Scholar
Inhabitat’s Week in Green: Self-driving Uber cars, and more!
The self-driving cars are coming: This week Uber unveiled its first autonomous vehicle, while Google patented a sticky “fly paper” car hood that could protect pedestrians caught in self-driving car crashes. Los Angeles celebrated the completion of its first new metro line in 60 years, while France transformed its trains with beautiful impressionist art. And Taga launched an affordable cycle with built-in cargo space that could be the ultimate family bike.
It was a big week for solar power as well. For starters, scientists developed a breakthrough photovoltaic cell that set a new world record for efficiency. Portugal made headlines by running on 100-percent renewable energy for more than four days. The Mistbox is a new device that uses solar energy to cut down on summer cooling costs. And scientists discovered a new form of spiralized light that flies in the face of everything quantum physics says about photons.
As the world’s population expands, food reserves are being stretched thin. Fortunately, researchers have developed a breed of corn that can double crop yields, and a new vertical farm can produce 16 acres’ worth of veggies on an eighth of an acre of land. Water is another pressing issue, and this week a team of scientists developed a paper-thin water filter that can remove bacteria as well as viruses at an affordable price. In other design and health news, researchers developed a new stem cell therapy that could cure blindness, and a UK man received a futuristic bionic arm with a USB port and built-in flashlight.
The After Math: Coming Soon
It’s been an auspicious week for big promises. Google announced a whole bunch of stuff at its 2016 I/O developers conference, though it didn’t have a whole lot in the demonstrable product department. Similarly, a trio of transportation companies announced that they’re working on self-driving cars, IBM showed off a new kind of optical storage system that will eventually do impressive things, and Google patented human flypaper that will one day coat cars.
The biggest news from Google I/O won’t matter until this fall
Google I/O, the company’s massive annual developer event, has wrapped up for 2016. As usual, CEO Sundar Pichai and a host of the company’s executives gave the world a look at what it’s planning for the next year. Unfortunately, we’ll need to wait to see how everything works in the real world, as nearly everything Google announced won’t come to fruition for months. But that’s not meant to minimize what Google announced this week — indeed, many of the company’s biggest and most important products will look a lot different six months from now.
Easily the flashiest two announcements this week were Daydream and Google Home. The latter is Google’s first entry into a relatively new product category, but it’s powered by years of organizing knowledge across the internet as well as everything it learns about its users. That sounds creepier than it is in reality — if you’ve opted into products like Gmail, Calendar and Google Now, Google Home will use all the info you’ve stored there to become a better product. As cool and surprisingly useful as Amazon Echo can be, it’s not hard to imagine that Google Home will trump it in a number of ways.
Daydream, meanwhile, reveals the company’s true VR ambitions. Cardboard was how it got its feet wet; Daydream is how it’ll really make an impact. By leveraging the combined forces of Google’s hardware partners, the flexibility and power of Android and the company’s army of developers, Google could be looking to mimic the strategy that made Android so successful in the first place.
The news that Google is rethinking messaging apps yet again was met with less enthusiasm, but the most important part of Allo isn’t smart replies — it’s the integration with the Google Assistant. That’s how Google refers to the bot that lives inside the app, letting you ask questions in natural language. That same assistant is what’ll make Google Home work, and it’s highly likely we’ll hear the company talk about the Google Assistant much more as the year goes on.
Other announcements that were more subtle but no less important to Google’s overall strategy include Instant Apps and the plan to bring Android apps to Chrome OS. By the end of the year, Chrome OS’s perpetual app problem could be solved — and the platform has already been growing significantly without this huge new feature. And Instant Apps could profoundly change how we currently use smartphones: not having to download and install apps you use infrequently could help keep your phone clutter-free.
Other Android news included a quite stable beta of Android N; in my limited testing, it’s definitely worth checking out. Android Wear itself saw a more profound redesign. While I’m not sure that we need a keyboard on our smartwatches, it’s good to see Google homing in on what users do most to make the experience better.
And, of course, Weird Google was on display, most notably in the ATAP presentation that saw the company announce a smart jean jacket designed with Levis for bikers, a smartwatch that you can control with radar-powered finger gestures, and a launch date for the long-awaited Project Ara modular smartphone.
Add it all up and this I/O felt like a fairly transformative show, even though there wasn’t a lot of stuff we could go try out immediately. I was hoping to get some time with Daydream or see how Allo or Google Home works, but we’ll have to wait on that for now. By the end of the year, though, the way Google and its users interact with each other will look a lot different.
Photography by Chris Velazco.
Google is making mobile search more visual with rich cards
Google’s new “rich cards” format will make googling on your phone a more visual experience than what you’re used to. It’s essentially an evolved version of rich snippets, those search results that come with small images and a short sample of the web page’s text, though it’s not meant to replace the older format altogether. The company is rolling out the feature for recipes and movies first. So, if you search for, say, X-Men: Apocalypse or a recipe for chocolate pie, you might see a carousel of cards you can scroll through sideways right at the top of the results page. For now, you’ll only encounter rich cards if you’re using the English version of Google.com, but the company will likely roll them out for more categories and languages in the future.
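Rich cards, like the rich snippets before them, are driven by structured data that publishers add to their own pages; for recipes, that generally means schema.org Recipe markup embedded as JSON-LD. As a rough illustration only (not Google's documentation, and with made-up recipe details), here is the kind of blob a recipe page might emit for the crawler to pick up, sketched in Python:

import json

# Illustrative only: the recipe details below are invented, and this is just a
# sketch of schema.org Recipe markup, which is what Google's recipe-oriented
# rich results key off of.
recipe_markup = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Chocolate Pie",
    "image": "https://example.com/chocolate-pie.jpg",
    "author": {"@type": "Person", "name": "Example Baker"},
    "prepTime": "PT20M",   # ISO 8601 duration: 20 minutes
    "cookTime": "PT45M",
    "recipeIngredient": ["1 pie crust", "200 g dark chocolate", "3 eggs"],
}

# The serialized JSON-LD would sit inside a <script type="application/ld+json">
# tag on the page so crawlers can read it.
print(json.dumps(recipe_markup, indent=2))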
Source: Google
‘Woorld’ makes a strong argument for weird Project Tango apps
It’s not hard to see how Google’s Project Tango can be utilitarian. Need directions through a crowded mall? Easy. Want to learn more about art installations as you wander through a museum? Done. What’s easier to miss is just how weird things can get when you’re holding a device that can sense the very environment around you, but Funomena’s new Tango game Woorld serves as a pretty good reminder.
In case you hadn’t heard, Funomena is an indie game studio in San Francisco that counts Katamari Damacy creator Keita Takahashi among its ranks. Gamers probably know exactly what that means for Woorld: it’s equal parts adorable and strange. In a nutshell, you’ll use a Tango device to scan your surroundings — the floor, walls, and even the ceiling if it isn’t too high. That initial sweep defines the bounds of a tiny little world, where you place objects like plants, faucets, houses, moons and more. Why? Partially just because you can, but also to make the world — as viewed through a screen, anyway — a little more beautiful.

See, unlike the Katamari Damacy series, or the more obtuse Noby Noby Boy, there doesn’t seem to be an overarching goal in Woorld. There’s an exploration mode (which we weren’t allowed to play with) that basically helps you wrap your head around how these objects interact — placing a cloud in the air and making it rain on a sprout causes the tiny plant to grow, and so on. Most of the time, though, you’ll be hanging out in a sandbox mode, free to place objects where you like and see how your tiny virtual world comes together. There might be more to the game — Google didn’t have much information on how the final product would turn out — but at least we won’t have to wait too long to find out.
The first consumer Project Tango device is set to launch in just a few weeks, but developers — like Takahashi and Funomena — have had access to development devices for months. With any luck, that means people have been toiling away on similarly off-the-wall stuff to give Project Tango hardware a more profound reason to exist. Navigating and learning more about the world around us is great and all, but I can’t wait to start seeing Tango apps that take the world around us and turn it on its ear.
Google and Levis are releasing their smart jacket early next year
Google and Levis announced a partnership at Google I/O last year that would bring “smart clothing” to market using a technology codenamed Project Jacquard. The tech is basically conductive fabric woven into the garment to create an interactive patch that senses touch, pressure and even your hand’s position before you touch the fabric. It’s a wild idea, and this year Google’s Advanced Technology and Projects (ATAP) group is showing it off in an upcoming product: the Levis Commuter jacket with Jacquard technology built right in.
As explained at Google I/O this morning, the jacket contains a weave of the Jacquard interactive threading on the left arm, and there’s a little Bluetooth-enabled loop you connect to the cuff of your jacket. That loop lets your phone talk to the jacket, and you can configure exactly which gestures you want to work with which apps.

The on-stage demo showed that you can swipe to adjust the volume of your music, tap to change tracks, and use another gesture to get navigation directions from Google Maps. The idea is to build in interactions that bikers can use to control their phones safely while riding, as the Commuter jacket was originally designed as a biking jacket.
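Google hasn't published the Jacquard developer APIs yet, but conceptually the configuration step boils down to mapping a small vocabulary of cuff gestures onto app actions. Here's a purely hypothetical sketch of that idea in Python; the gesture names and handlers are invented for illustration and don't reflect any real SDK.

# Hypothetical sketch: map Jacquard-style cuff gestures to phone actions.
# None of these names come from Google; the real Jacquard SDK may look
# nothing like this.

def volume_up():
    print("Raising music volume")

def next_track():
    print("Skipping to the next track")

def speak_next_turn():
    print("Reading the next navigation instruction aloud")

# Roughly what a "configure your gestures" screen would produce.
gesture_actions = {
    "swipe_up": volume_up,
    "double_tap": next_track,
    "cover_cuff": speak_next_turn,
}

def on_gesture(name):
    # Called when the Bluetooth tag reports a recognized gesture.
    handler = gesture_actions.get(name)
    if handler is not None:
        handler()

on_gesture("double_tap")  # prints "Skipping to the next track"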

The jacket itself is made using Levis’ standard manufacturing techniques, and it doesn’t need any special care. It can be washed, worn and treated like any other garment; you just need to remove the Bluetooth loop before washing it. And it’s just the first garment that Levis is making with this tech — the company wants to make athletic and business wear as well.
Developers will have access to a host of APIs to make their apps work with Jacquard, and the hope is that many apps will support the garment by the time it ships. Right now it works with your call and messaging apps, Google Play Music and Maps, plus third-party apps from Spotify and Strava.
The first Project Jacquard garment will ship from Levis in spring 2017, but there’s a “beta test” launching this fall. If it works as well as it seemed to in today’s quick demo, it’ll definitely be worth keeping an eye on.

Google aims to launch its consumer Project Ara phone in 2017
Despite some grim portents last year, Google’s “Project Ara” modular smartphone is far from dead. For starters, it now has its own business unit within Google’s mysterious Advanced Technology and Projects (ATAP) group, and the nearly magical modular hardware we’ve been anticipating for years is indeed getting closer. ATAP Head of Creative Blaise Bertrand confirmed at Google’s I/O conference today that a new developer phone will be available in Q4 of this year with a “thin, light, beautiful” consumer Ara phone to follow in 2017.
Considering the ATAP team’s early progress, it seemed for a while that the predicted launch in 2016 was still a possibility. Still, some time away from the spotlight seems to have done the entire project a lot of good: The developer version shown off on stage today was considerably sleeker than iterations we’ve seen in the past. The modules used to be separated from each other by a thick metal grid, for instance, a technical necessity that didn’t look so great. Now, there’s essentially no boundary between the modules, lending the phone a more unified — if still eclectic — look. The first batch of modules seems mostly flush with each other (remember the old Ara’s camera hump?), which only helps the phone look more premium.
The new Ara frame supports up to six modules, which form what Google calls the “world’s first UniPro network” with the phone. Don’t worry: Future frames can be larger or smaller, and future modules will also be compatible with earlier Ara devices. Most importantly, it seems much, much easier to install modules and start using them. All you have to do is plug one in and you’re good to go — the process doesn’t even require a reboot. Ejecting them is simple too, if a little more involved: You’ll have to jump into the settings app and select which module you want to eject. Once done, the module physically pops up from the back of the device. You’ll even be able to eject modules with voice commands like “OK Google, eject the camera.”

Maybe the biggest thing to remember is that Project Ara is only as good as the modules available for it. Thankfully, the ATAP team still has a considerable number of partners either interested in, or actively developing, such smartphone add-ons. The list for now includes E Ink (for tiny secondary displays), Toshiba (maybe for additional storage?), Gotenna (for potential off-the-grid communications) and Sony Pictures Home Entertainment (for God knows what). Alas, developers who sign up for their pre-release kits might not get to play with many of these — we’re told the dev units only ship with a few modules.
Google controls a smartwatch with radar-powered finger gestures
Last year, Google gave us a taste of Project Soli, an effort to deliver radar-powered finger gesture control to wearables. Today, we got a closer look at how it could be implemented in real products, starting with a customized LG Urbane smartwatch. Google reps were able to control the watch simply by holding their fingers in front of it. As they moved closer, even more options opened up. As they moved away, the standard watch face returned. It’s just a demo right now, but Soli could solve the problem of controlling smart devices with tiny screens (or with no screens at all).
“We’ve developed a vision where the hand is the only controller you need,” said Ivan Poupyrev, a technical program lead at Google’s Advanced Technology and Projects (ATAP) group. “One moment it’s a virtual dial, or slider, or a button.” Basically, Google is trying to create a whole new gesture language for every device in your home.
After rolling out a Project Soli alpha developer kit last year, Google selected 60 developers from a pool of 180 applicants to show off their implementations. One group used Soli to identify materials like copper, while another used it for 3D imaging. The coolest experiment, though, was using it as an in-car remote control. Imagine using gesture controls in your car just by raising your fingers off the steering wheel a bit.
While Google’s initial Soli dev kit worked, it was a bit of a pain. Lawyers made the Soli group put a warning on the back of the kits, because they drew an insane amount of power. They also had to be connected to powerful desktops to work. Over the past year, Google set about refining Soli’s design so it can be implemented anywhere.
“If you can make something work on a smartwatch, you can make it run any way you want,” Poupyrev said.
Together with Infineon, Google reduced Soli’s power consumption 22x, from 1.2 watts to 0.054 watts. The team also got it running on standard mobile chips like Qualcomm’s Snapdragon 400 and a recent Intel Atom by optimizing its code to be 256 times more efficient. And despite those cuts, Soli still manages an 18,000 FPS radar rate.
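(For what it’s worth, the power figure checks out: 1.2 W divided by 0.054 W comes to roughly 22.2, hence the 22x claim.)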

Google initially had the Soli chip on the watchband of the LG Urbane, but that still looked pretty clunky. With some help from LG, the team was able to fit it into a small area right below the screen. What’s really intriguing is the fine-grained control Soli can pick up — it can even tell when you’re rubbing your fingers together.
The Soli team also implemented its technology in a JBL speaker, which recognized larger finger and hand gestures. It lit up as someone’s hand drew near, and it was able to skip tracks with a simple thumbs-up gesture. It’s an example of how Soli could be used to control smart home devices from afar, without touching them.
So what’s next for Soli? We can expect to see more experimental implementations over time. And next year, Google will roll out a beta version of the Soli dev kit, which looks significantly smaller than what devs have today. It’ll be a while until this technology reaches typical consumer products, but Google’s progress over the past year is impressive all the same.
Source: Project Soli
Google wants to make it easier to craft apps that go big
The Google Play Store serves over a billion users around the world every month, so the potential for an app to go big is nothing to sneeze at. Alas, the chances of that actually happening are a different story, which is why Google has released a bunch of updates to help developers craft apps that make more of an impact.
First off, Google is making it easier for would-be beta testers to find non-final software in the Play Store. For the first time, open beta apps will show up in your Play Store search results, with the most promising betas getting some spotlight in a new Early Access section. This latter effort is part of a bigger push to make it easy for users to find new, valuable apps — that same rationale is why Google is rolling out Collections in the Play Store. Looking for real estate apps, or apps that are great for young ones? Your search should become easier very shortly.
These changes are welcome additions for users, but most of the big news today is for developers who want to reach bigger audiences. New to the Play fold is a set of guidelines called “Building for Billions” meant to give devs insight into crafting apps for crucial emerging markets. And to help apps feel just a little neater internationally, the Play Store will automatically round prices converted between currencies. After all, a $1.99 app seems pretty normal in the US store, but seeing the equivalent ¥218.12 in the Japanese store lacks a certain panache: Now the Play Store will round it to ¥200.
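Google hasn't spelled out exactly how the rounding works, but the idea is simple: convert the price, then snap it to a natural-looking local price point. Here's a minimal sketch of that logic in Python, using a made-up exchange rate and a made-up rounding rule rather than anything Google has published:

# Sketch only: the exchange rate and the "round to the nearest 100 yen" rule
# are assumptions for illustration, not the Play Store's actual logic.
USD_TO_JPY = 109.61  # hypothetical exchange rate

def charming_jpy_price(usd_price):
    raw = usd_price * USD_TO_JPY            # $1.99 -> about 218.12 yen
    return int(round(raw / 100.0)) * 100    # snap to the nearest 100 yen

print(charming_jpy_price(1.99))  # 200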



