
Posts tagged ‘Facebook’

6 Jul

Recommended Reading: Colombia’s high-tech World Cup training and testing Facebook


Recommended Reading highlights the best long-form writing on technology and more in print and on the web. Some weeks, you’ll also find short reviews of books that we think are worth your time. We hope you enjoy the read.

Colombia v Uruguay: Round of 16 - 2014 FIFA World Cup Brazil

Colombia’s High-Tech Advantage in its World Cup Match Against Brazil
by Matt Hartigan, Fast Company


Colombia certainly faced an uphill climb against Brazil at the World Cup, and the side turned to tech to increase its chances of winning throughout the tournament. During training, the Colombian national team leveraged Catapult: a wearable GPS system that maps players’ bodies in three-dimensional space to gauge the “load” placed on each athlete. Among a host of other features, the sensors transmit data to coaches and staff instantly at ranges of 250 to 300 feet. Soccer isn’t the only sport using the system, either: the San Antonio Spurs implemented it to track basketball prospects ahead of last week’s NBA Draft.
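To get a feel for what a “load” metric means, here’s a minimal sketch in Python. Catapult’s actual PlayerLoad formula is proprietary, so this is purely illustrative: it just accumulates the magnitude of change in tri-axial accelerometer readings between samples.

```python
import math

def player_load(samples):
    """Accumulate a simple "load" score from tri-axial accelerometer
    samples, in the spirit of Catapult's PlayerLoad metric. (The vendor's
    exact formula is proprietary; this is only an illustrative sketch.)

    samples: list of (ax, ay, az) readings in g, taken at a fixed rate.
    """
    load = 0.0
    for prev, curr in zip(samples, samples[1:]):
        # Sum the magnitude of the change in acceleration between samples:
        # standing still adds nothing; sharp cuts and sprints add a lot.
        dx, dy, dz = (c - p for c, p in zip(curr, prev))
        load += math.sqrt(dx * dx + dy * dy + dz * dz)
    return load

# A player standing still accumulates no load...
print(player_load([(0.0, 0.0, 1.0)] * 5))  # 0.0
# ...while abrupt changes of direction drive the score up.
print(player_load([(0.0, 0.0, 1.0), (0.8, 0.2, 1.4), (-0.5, 0.1, 0.9)]))
```

Summed over a session, a number like this gives coaches a rough per-athlete workload they can compare across training days.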

The Test We Can — and Should — Run on Facebook
by Kate Crawford, The Atlantic

By now, you’ve likely heard about Facebook toying with our emotions with manipulated News Feed content. The backlash has certainly been significant, questioning the ethics of this sort of secret user testing. The Atlantic’s Kate Crawford examines questions surrounding the whole process while suggesting an opt-in model for future experimentation.


Hospitals Are Mining Patients’ Credit Card Data to Predict Who Will Get Sick
by Shannon Pettypiece and Jordan Robertson, Bloomberg Businessweek

Too busy with work to prepare healthy meals, settling for regular pizza deliveries instead? Well, soon your doctor could give you a phone call telling you to cut it out. Carolinas HealthCare System is examining the buying habits of 2 million patients to determine who’s at high risk and to offer care before they ever show up at the office.


Razr Burn: My Month With 2004’s Most Exciting Phone
by Ashley Feinberg, Gizmodo

Back in the summer of 2004, many of us coveted the Motorola Razr V3 when it made its debut. The ultra-thin flip phone was a powerful and expensive option at the time, and it went on to become one of the most recognizable handsets ever made. To celebrate its 10th birthday, one tech journalist opted for the Razr over an iPhone for a month, documenting the entire experience.


Tracking the Digital Revolution, From Pong to ‘Gravity’
by Alice Rawsthorn, New York Times

A new exhibit at London’s Barbican Centre takes a look at the evolution of digital media, from Pong in 1972 right up through experiments from designers, coders and others today. The New York Times offers a brief synopsis of the collection, including the technical challenges of showcasing tech from the ’80s and ’90s as it was originally meant to be experienced.


[Photo credit: Clive Rose/Getty Images]

Filed under: Misc

Comments

4 Jul

China’s scalpers force Oculus to suspend Rift sales


With over 100,000 developer kit sales logged in its docket, it’s fair to say interest in the Oculus Rift is high. While we wait for the inevitable release of the consumer model, scalpers in China are snapping up developer versions at such a rate that the Facebook-owned company has been forced to suspend sales in the country. According to comments made by an Oculus representative on Reddit, the VR specialist was seeing “extreme reseller purchases,” which were presumably sold at an unhealthy markup and took stock away from legitimate developers. While the company’s DK2 headset is making its way to buyers, it’s considered an in-development version of the Rift and isn’t intended for consumers.

How bad was the reselling in China? “We were forced to suspend an entire country from purchasing,” says this Oculus employee. “I’ll let you put two and two together.” The good news is that the company is making it a priority to look into an alternative sales process, allowing Chinese developers to create slick VR experiences for the rest of us when the Rift finally gets its public release.

Filed under: Gaming, Wearables, HD, Facebook

Comments

Via: Eurogamer

Source: r/Oculus (Reddit)

4 Jul

Facebook Messenger gets a native iPad version


Facebook Messenger for iPad

Facebook Messenger makes the most sense on a phone, but plenty of people still want to chat on their tablets — and apparently, Facebook knows it. The social network has just updated Messenger for iOS to support the iPad, letting you carry on a conversation without having to either dig through chats in the main Facebook app or rely on third-party titles. The iPad interface mostly behaves like a super-sized version of what you see on an iPhone, although you miss out on a handful of recent feature additions, like tap-and-hold video capture and the split-screen selfie mode. There’s no corresponding native interface for Android tablets just yet, although the iPad refresh suggests that one might be on the horizon.

Filed under: Tablets, Internet, Facebook

Comments

Via: TechCrunch

Source: App Store

3 Jul

Former researcher says Facebook’s behavioral experiments had ‘few limits’


Facebook’s still trying to brush off that whole psychological study with unaware users thing, but according to a former team member and outside researchers, the social network’s data science department has had surprisingly free rein over how it polled and tweaked the site (changes have been promised). Andrew Ledvina, who worked as a Facebook data scientist from February 2012 to July 2013, told the WSJ: “Anyone on that team could run a test.”

In 2010, the research team gauged how “political mobilization messages” delivered to 61 million users affected voting in that year’s US congressional elections, while in 2012, thousands of users received a warning that they were being locked out of Facebook. While the reason given was that Facebook believed they were bots or using fake names, the network already knew that the majority were real users — it was apparently all in aid of improving Facebook’s anti-fraud system.

Since its beginnings, Facebook’s data science team has apparently run hundreds of behavioral tests. As Ledvina put it: “They’re always trying to alter peoples’ behavior.” Other tests and research are apparently less invasive: less emotional button-pushing, more button testing in an effort to get users to click on more ads and generally spend more time on the site. As the WSJ notes, Facebook isn’t the only one: Twitter, Google, Microsoft et al. also research and monitor their users.

[Image credit: AFP/Getty Images]

Filed under: Science, Internet, Facebook

Comments

Source: WSJ

3 Jul

Engadget Daily: Facebook’s emotional experiment, social media activism and more!


Today, we break down the phenomenon of social media activism, investigate Facebook’s user experiment, ponder NVIDIA’s next Shield console and get excited about the reboot of Chumby’s smart alarm. Read on for Engadget’s news highlights from the last 24 hours.


What you need to know about social media activism

What do the “OccupyWallStreet” and “CancelColbert” hashtags have in common? They’re both examples of what’s been termed “social media activism.” Read on as Ben Gilbert dissects this modern form of protest and what it means to you.

Facebook used you like a lab rat and you probably don’t care

Smiles are contagious. So are depressing Facebook posts, apparently. In 2012, Zuckerberg and Co. manipulated users’ happiness (gasp) by secretly bombarding their news feeds with waves of positive and negative stories.

Chumby’s smart alarm clock relaunches with 1,000 apps

Good news, Chumby owners: the tiny smart alarm’s network is back online and better than ever. What’s more, the relaunch includes 1,000 apps and more efficient performance.

NVIDIA’s Shield successor is a tablet

NVIDIA’s next Shield console might not be a “console” at all. A listing from the Global Certification Forum has mistakenly leaked information about an upcoming “Shield Tablet,” including some specs.

Filed under: Misc

Comments

3 Jul

#SandbergSays: We’re really not sure.


“Meh.” It’s possible that Facebook COO Sheryl Sandberg was thinking this when speaking at an event in New Delhi. In an interview with NDTV of India, she apologized to users for toying with their emotions in a 2012 experiment. Sandberg went on to admit that the experiment was “poorly communicated.” But based on this perfectly timed picture, is she really sorry?

We have our own guesses as to what was running through Sandberg’s head, but we know you can do better. Tweet your own caption at us with the hashtag #SandbergSays and we’ll update this post with our favorite picks.

[Thanks, Robinson Meyer, for sharing this picture with the Twittersphere.]

Billy Steele: “Breakfast burritos? Meh.”

Christopher Trout: “Can we just hug it out?”

Edgar Alvarez: “Drop the ‘sorry.’ Just shrug. It’s cleaner.”

John Colucci: “I’m taking these people back to dial-up.”

Jon Fingas: “What, me worry (about scientific ethics)?”

Kris Naudus: “This dress is toying with your emotions right now.”

Sean Buckley: [this]

Timothy J. Seppala: “Meanwhile, everyone else is all like (╯°□°)╯︵ ┻━┻”

Terrence O’Brien: “I farted.”

Comments

Source: Mashable, @yayitsrob

2 Jul

Irish court ruling says defacing Facebook and physical property are the same thing


What happens on Facebook doesn’t just stay on Facebook, and your social network activity can be used against you in a court of law. Trolling, bullying and posting offensive content can all land you in hot water, not to mention that your Facebook history can be used as evidence in all kinds of criminal cases. Currently, even the US Supreme Court is trying to clarify the legal accountability of social media. Now, in what’s thought to be the first prosecution of its kind, a man in Ireland has been charged with “frape” — the rather tactless term that describes defacing someone’s Facebook page from within their account.

As the Irish Times reports, the man has been fined €2,000 (just over $2,700) for posting a malicious status update on an ex-girlfriend’s account using her phone. He was earlier acquitted of more serious crimes against the woman, but pleaded guilty in this instance. Interestingly, the man wasn’t charged with any kind of cybercrime, but under the Criminal Damage Act, which is usually reserved for cases involving physical property. In the US, the equivalent charge would be something along the lines of vandalism, and only time will tell whether this sets any future precedent or remains an isolated judgement. And, if you believe you’ve been the victim of social media humiliation at home or in the workplace, contact our elite team of Facebook prank lawyers now on 1-800-JUSTLOGOUT.

Filed under: Internet, Facebook

Comments

Via: The Independent

Source: Irish Times

2 Jul

What you need to know about social media activism


Wall Street Protest Logistics

Protests in the Middle East, known as “the Arab Spring,” echoed around the world. On Friday, December 17, 2010, a fruit vendor named Mohamed Bouazizi covered himself in flammable liquid and lit a match. His body was quickly engulfed in flames and, despite attempts to save his life, Bouazizi died on January 4th, 2011. He was 26 years old. Much as Buddhist monk Thích Quảng Đức’s self-immolation in Saigon nearly 50 years earlier represented the frustration of many Vietnamese, Bouazizi’s act became symbolic of a much larger frustration in Tunisian society.

What happened next, however, was a product of modern times: Word spread of Bouazizi’s act through social networks, with Facebook specifically becoming a flashpoint for protest organizing across the country. By the time Tunisia’s former leader, Zine El Abidine Ben Ali, resigned and fled the country in mid-January 2011, over a fifth of Tunisia’s population was on Facebook.

WHAT IS IT?

The term “social media activism” is intentionally ambiguous, because its application varies depending on what it’s connected with. Both Occupy Wall Street and “#CancelColbert” fall under its umbrella, and with those two examples, you already kinda know what it is, right? Social media activism can be as simple as a trending topic (“#CancelColbert”) that lets interested parties engage in a bigger conversation, and as complex as Occupy Wall Street’s multiplatform, multimedia initiative. As the name implies, there’s no standard social network used for social media activism; YouTube, Twitter, Facebook, Sina Weibo and myriad others are employed as need be.

In the case of Tunisia, Facebook was the social service of choice, with hackers, protesters and everyday Tunisians using the service collaboratively. It served as a message board for sharing images, video and stories, in addition to creating a public forum for communication.

In response to the Santa Barbara shootings by Elliot Rodger, activists and general newsreaders alike used the “YesAllWomen” hashtag on Twitter. The hashtag is still in use over a month later, and it has become an ongoing conversation about women’s rights versus how women are treated in reality.

WHY SHOULD I CARE?


Beyond the whole “you’re a participating member of human society” thing, social media activism is a fascinating modern version of protest and communication. Because of the internet, social media platforms and the ubiquity of mobile phones with cameras, activism and protest are now truly global events. Not interested in participating? That’s fair!

The other side of the coin is that, sometimes, these movements affect your life whether you like it or not. If you were in Egypt in early 2011, whether you were part of the conversation or not didn’t matter: The president was overthrown.

WHAT’S THE ARGUMENT?

While not an “argument” per se, some say that media coverage focuses on the medium — social media — over the message, and it ends up diluting the protest. Author Malcolm Gladwell argues as much in The New Yorker: “People protested and brought down governments before Facebook was invented. They did it before the internet came along. Barely anyone in East Germany in the 1980s had a phone – and they ended up with hundreds of thousands of people in central Leipzig and brought down a regime.” Gladwell’s also questioned the efficacy of social media in organizing physical protest; it’s easy for people to participate online, but far more difficult to turn those words into action (so the argument goes).

Back in May, a tweet from The Colbert Report‘s official Twitter account made a grave error: publishing a punch line from Colbert’s show that night without including the joke’s setup. In lampooning Washington Redskins owner Dan Snyder, Colbert had delivered the line in reference to a (fake) video that purported to catch him making racist remarks about Asians. The tweet has since been deleted.

In response, writer/activist Suey Park created the “#CancelColbert” hashtag. It became a rallying cry for some Asian Americans to speak about their experiences with racism in America. But other Asian Americans — notably Deadspin‘s Tommy Craggs and Kyle Wagner — found that Park’s use of “hashtag activism” only served to misdirect the conversation away from Snyder. It’s not the first such case, but it’s certainly the most prominent example of what many believe to be a misuse of social media activism.

WANT EVEN MORE?


We sure hope you do, because there’s quite a bit on the subject that’s worth reading. The MIT Technology Review has a great piece from John Pollock digging into the hackers behind Tunisia’s uprising. Al Jazeera America wrote about “#CancelColbert” and whether social media activism is effective; The New Yorker spoke with Park and discussed her background. The New York Times has a thorough background on Bouazizi and similar acts.

And finally, Jehane Noujaim‘s excellent 2013 documentary The Square both demonstrates the use of social media activism in a real-life revolution setting, and grippingly details the movement in Tahrir Square. It’s on Netflix, even! Don’t miss it!

[Image credit: AP Photo/John Minchillo (Zuccotti Park), The White House (Michelle Obama), Ferdinand Delacroix, Comedy Central, Twitter (@ColbertReport), AP Photo/Maya Alleruzzo (Facebook/Twitter)]

Filed under: Cellphones, Handhelds, Internet, Software, Facebook

Comments

2 Jul

Facebook’s awkward mood experiment under investigation in the UK


Facebook may have brushed off the furor over a psychological experiment that influenced what users saw in their feeds, but UK regulators definitely haven’t. According to the Financial Times, Britain’s Information Commissioner and the Irish Data Protection office (Facebook’s EU base is in Ireland) are probing the social network’s activities to determine if it did anything illegal. Back in 2012, Facebook changed the number of negative or positive posts that a select group of users saw in their feeds, ostensibly to gauge the effect on their moods. People who saw more negative content were more inclined to write negative posts themselves, and vice versa. While it apologized, Facebook also tried to justify the experiment by saying it benefited users and didn’t compromise anyone’s privacy. Still, when a UK politician told the Guardian that “if there is not already legislation on this, then there should be,” it didn’t seem the matter would quietly go away.

[Image credit: Getty/AFP]

Filed under: Internet, Facebook

Comments

Via: Bloomberg

Source: The Financial Times

2 Jul

Facebook used you like a lab rat and you probably don’t care



Companies perform A/B testing — minor site variants to see what users like or don’t like — all the time. Twitter does it with its experimental features, and sites like ours tweak designs for a sample of users to see which ones they prefer. In January 2012, researchers at Facebook did something like that too. When people heard about it last week, however, they were outraged. Facebook, in the course of the study, messed with users’ emotions without explicitly letting them know about it. But as outrageous as that is, it likely won’t make a difference.
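To make the A/B testing mentioned above concrete, here’s a minimal sketch of the usual pattern: hash a user ID together with an experiment name so each user lands in a stable, seemingly random bucket. This is illustrative only, not Facebook’s or Twitter’s actual implementation.

```python
import hashlib

def ab_bucket(user_id, experiment, variants=("control", "treatment")):
    """Deterministically assign a user to a test variant by hashing
    their ID with the experiment name. (A common industry pattern,
    sketched here for illustration.)"""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).digest()
    return variants[digest[0] % len(variants)]

# The same user always lands in the same bucket for a given experiment,
# so each person sees a consistent version of the site.
assert ab_bucket(42, "blue_button") == ab_bucket(42, "blue_button")
print(ab_bucket(42, "blue_button"))
```

Keying the hash on the experiment name means a user’s bucket in one test doesn’t predict their bucket in another.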

In the span of seven days, researchers rejiggered the News Feeds of 689,000 users to surface either more positively or more negatively worded stories. The study found that users who saw the positive stories were more likely to use positive words in their own posts, and users who saw negative ones were more likely to use negative words. According to the paper published in the Proceedings of the National Academy of Sciences, the study found that “emotional states can be transferred to others via emotional contagion” and that this can happen “without direct interaction between people.”
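The researchers scored posts by counting emotion words against the LIWC dictionaries. A toy version of that word-counting approach might look like the following sketch; the word lists here are made up for illustration, not the actual LIWC lexicon.

```python
# Tiny stand-in word lists; the real study used the LIWC dictionaries,
# which contain thousands of entries.
POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "worried"}

def post_tone(text):
    """Classify a post as positive, negative or neutral by counting
    emotion words, roughly the way the PNAS study scored stories."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(post_tone("What a great, wonderful day!"))    # positive
print(post_tone("Feeling sad and worried today."))  # negative
```

With posts labeled this way, the experiment then reweighted which ones a user saw, and measured the tone of what that user subsequently wrote.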

Let’s face it: Most people don’t read policies and terms of service before agreeing to them, and even if they did, the terms are pretty difficult to understand

It seems like a relatively innocuous study, right? Even Adam Kramer, the study’s author, wrote that the impact of the study was fairly minimal. But this experiment goes beyond the pale, for several reasons. For one thing, we didn’t know it was happening. The American Psychological Association (APA) states in its Code of Conduct that in the process of doing psychological research with human beings, informed consent is required — it needs to be offered in a “language that is reasonably understandable to that person or persons.” The part of Facebook’s Data Use Policy that seems to allude to this states that the company would use your information “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”


According to Forbes, however, this particular language didn’t even appear in the agreement until four months after the study took place. And, let’s face it: Most people don’t read policies and terms of service before agreeing to them, and even if they did, the terms are pretty difficult to understand. Plus, that sentence is vague enough that it doesn’t convey the possibility of a psychological study. It’s logical to assume that the “research” mentioned here alludes to something harmless, like making a button red instead of blue, rather than studies that probe the inner workings of your mind. That’s not “informed consent” as the APA defines it, even if Facebook claims that it underwent a strong “internal review” process.

It’s bad enough that the study occurred without Facebook users’ permission. But it didn’t just observe users’ actions — it intentionally meddled with their emotions. When we go on Facebook, we generally expect to catch up on our friends’ lives unencumbered by any emotional sleight of hand. Sure, the advertising on Facebook is a form of emotional manipulation too, but many of us understand what we’re getting into when we see an ad — we expect to be pandered to and cajoled. We don’t expect that same manipulation in our regular News Feed.

A local review board had approved the methodology “on the grounds that Facebook apparently manipulates people’s News Feeds all the time.”

But — and here’s the part that many people don’t necessarily realize — Facebook has been messing with your News Feed anyway. Susan Fiske, a Princeton University professor who edited the study for publication, told The Atlantic that a local institutional review board had approved the methodology “on the grounds that Facebook apparently manipulates people’s News Feeds all the time.” And she’s right — your News Feed is filtered based on a variety of factors so that some stories float to the top, while others don’t. It’s all part of Facebook’s unique News Feed algorithm that intends to surface the “right content to the right people at the right time” so that you don’t miss out on stories that matter to you. So, for example, you’ll see a best friend’s wedding photos over what a distant relative said she was having for lunch if your behavior on Facebook leads it that way.
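The filtering described above boils down to scoring each candidate story and keeping the top few. Here’s a deliberately simplified sketch of that idea; the factors and weights are invented for illustration, and Facebook’s real algorithm weighs far more signals.

```python
def rank_feed(stories, top_n=10):
    """Sort candidate stories by a toy relevance score and keep the top
    few, loosely mimicking how a feed algorithm whittles hundreds of
    candidates down to what you actually see. (Illustrative weights
    only, not Facebook's actual formula.)

    stories: dicts with 'affinity' (closeness to the poster),
    'engagement' (likes/comments) and 'age_hours'.
    """
    def score(s):
        return (2.0 * s["affinity"]        # closer friends rank higher
                + 1.0 * s["engagement"]    # popular posts rank higher
                - 0.1 * s["age_hours"])    # older posts decay
    return sorted(stories, key=score, reverse=True)[:top_n]

feed = rank_feed([
    {"id": "wedding", "affinity": 0.9, "engagement": 3.0, "age_hours": 2},
    {"id": "lunch",   "affinity": 0.1, "engagement": 0.2, "age_hours": 1},
])
print([s["id"] for s in feed])  # ['wedding', 'lunch']
```

Even in this toy version, a best friend’s wedding photos outrank a distant relative’s lunch post, which is exactly the behavior the paragraph above describes.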


In a way, the algorithm makes sense. According to Facebook, there are on average 1,500 potential stories every time you visit your News Feed, and it’s easy for important and relevant posts to get lost in the mix if you have to sift through it all. And from Facebook’s perspective, surfacing more pertinent stories will also get you to stick around and engage more, and maybe help the company get more ad impressions in the process. The flip side, of course, is that Facebook is actually deciding what to show you. Most of us probably don’t care about this because we’re usually unaware of it, and it’s actually beneficial at times. But sorting posts just because they’re positive or negative takes it too far; it turns us from customers into lab rats. Yet we’re all so used to this sort of manipulation that many of us probably never noticed.

In response to the negative reactions that the study caused, Kramer said in his post that the company’s internal review practices would incorporate some of the lessons it’s learned from the reaction to the study. Facebook also sent us the following statement:

“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

Facebook’s mea culpa is certainly appreciated, but it still doesn’t quite resolve the biggest pain point: The experiment altered our moods without our consent. Also, let’s not forget that Facebook has messed up with privacy before — one of the more famous examples is the company’s Beacon program, which broadcast your online shopping habits without your knowledge. This isn’t exactly a company that can afford any further damage to its reputation. The firm has certainly made strides in recent years to show it’s committed to user privacy by defaulting posts to friends only and making privacy options clearer. But it only takes a mistake like this to have everyone question their allegiance to Facebook again.

Facebook’s mea culpa is appreciated, but it doesn’t quite resolve the biggest pain point: The experiment altered our moods without our consent.

Or will it? The fact is that even with this controversial study revealed, most people will still continue to use Facebook. The company continues to grow — it went from a million users in 2004 to almost 1.2 billion in 2013 — despite the multiple privacy faux pas throughout the years. The social network has commanded such a loyal and dedicated following that none of these breaches in public trust have seriously damaged it. Most people just don’t seem to care that their feeds are being manipulated, with or without their consent, as long as they still get to play Candy Crush Saga and see photos of their grandkids. After all, if you really cared about controlling your privacy, you’d look into getting off the internet entirely.

[Image credit: Bloomberg via Getty Images, AFP/Getty Images]

Filed under: Internet, Facebook

Comments