USEMP Computing Now editorial published

The USEMP team has published a new editorial, “Privacy Awareness and User Empowerment in Online Social Networking Settings”, in the January 2015 edition of Computing Now magazine. Symeon Papadopoulos (CERTH) and Adrian Popescu (CEA) describe the technical challenges and the state of the art in online privacy awareness research for Online Social Networks (for example, the homophily principle, location mining from image content, and usability challenges), based on USEMP research work. David Lund (HWC) and Sanja Ilic (VELTI), from USEMP's industrial partners, provide valuable insights into industry perspectives on privacy and user empowerment. The editorial, along with related video material, is available online in the January 2015 edition of Computing Now.

David Lund video (HWC) on industry perspectives for online privacy awareness

Sanja Ilic video (VELTI) on industry perspectives for online privacy awareness


Secret Research Guidelines for A/B Testing and More

Research on Facebook data

It is great to see that such guidelines are being considered, though from the perspective of the guinea pigs (us) we need both consent and transparency, even if the data is anonymized. The reasons are, first, that methodological rigor requires that the findings can be tested by peer researchers, and, second, that those to whom the inferred knowledge will be applied will be affected without having an inkling of how their minds and moods are being nudged.

The secret system controlling your Facebook News Feed

Find your friends (Image: Ed Ou/Getty Images)

31 July 2014 by Hal Hodson

No one really knows exactly how Facebook decides what we see when we log in. Reverse-engineering the algorithm behind it could help us find out

WHO controls your Facebook News Feed? We are fed a specially selected diet of jokes, photos and gossip from our Facebook friends, but not by a person. Instead an algorithm does the work – giving it the power to influence us.

The furore over an experiment in which Facebook researchers attempted to manipulate users’ emotions via their News Feed, albeit only slightly, highlighted the extent of that power.

Facebook’s algorithms are a closely guarded secret. “These are black boxes,” says Christo Wilson of Northeastern University in Boston. “In many cases the algorithms are held up as trade secrets, so there’s a competitive advantage to remaining non-transparent.”

For Karrie Karahalios and Cedric Langbort at the University of Illinois and Christian Sandvig at the University of Michigan, Facebook’s influence is out of balance with our understanding of how its algorithm works. So they are carrying out what they call a collaborative audit, looking at the Facebook experiences of thousands of people to work out the underlying algorithmic rules.

To do this they have created an app called FeedVis, which creates a stream of everything that your friends are posting. When I tried it, I saw an endless stream of comments, likes and posts by friends I’d forgotten I had. To the right I saw my standard News Feed, which was empty by comparison.

In their first, small study using FeedVis, the team found that most people – 62 per cent – didn’t know that the News Feed is automatically curated. People were shocked that they weren’t seeing everything their network posted. In cases where posts of close friends or family were excluded, many became upset.

The team is starting to understand some of the basic rules that govern what people see. “We know that if you comment on someone’s wall, you’re more likely to see a post from them than if you just like something,” says Karahalios. “And if you go to a person’s timeline you’re more likely to see content from them later.” The work was presented at the Berkman Center at Harvard University last week.
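To make those heuristics concrete, here is a minimal, purely hypothetical sketch of how such interaction signals might be combined into a per-friend affinity score. The weights, field names, and data below are invented for illustration only and do not reflect Facebook's actual (secret) algorithm.

```python
# Purely illustrative sketch (not Facebook's actual algorithm): a toy
# per-friend affinity score built from the interaction signals the
# researchers describe, where commenting weighs more than liking, and
# visits to a person's timeline boost their posts.

# Hypothetical per-friend interaction counts collected by an auditing tool
interactions = {
    "alice": {"comments": 5, "likes": 2, "timeline_visits": 3},
    "bob":   {"comments": 0, "likes": 8, "timeline_visits": 0},
}

# Assumed relative weights: comments > timeline visits > likes
WEIGHTS = {"comments": 3.0, "timeline_visits": 2.0, "likes": 1.0}

def affinity(friend: str) -> float:
    """Toy affinity score: a higher value means that friend's posts are
    more likely to surface in the curated feed."""
    counts = interactions.get(friend, {})
    return sum(weight * counts.get(signal, 0) for signal, weight in WEIGHTS.items())

# Rank friends by the toy score, most "visible" first
for name in sorted(interactions, key=affinity, reverse=True):
    print(f"{name}: affinity {affinity(name):.1f}")
```

Under these invented weights, a friend you comment on regularly outranks one you merely like, which matches the pattern the audit reports; the real ranking presumably combines many more signals and changes over time.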

But Facebook’s algorithms change constantly. “Even if I figure it out today, that doesn’t necessarily mean it’ll be like that tomorrow,” says Wilson.

To expand the experiment, the team will recreate a person’s profile based on their likes, comments and other Facebook activity and then see if they can detect patterns in what their News Feed shows them.

Already, Facebook appropriates its users’ profiles to create adverts on their friends’ feeds that look like normal content. There are other tricks, too. “I could share a link to the McDonald’s website, commenting that a McLobster sounds disgusting,” says Sandvig. If you like that link, Facebook registers that you like McDonald’s. “It doesn’t appear on your feed, but your friends will get ads that say ‘Hal likes McDonald’s’,” he says.

Understanding these dynamics is crucial, as Facebook is increasingly the tool that people use to communicate and find out about their world. “In the history of mass media, there have been channels with huge reach, but it’s typically a human in the apex of the control loop,” says Wilson. “That’s just not true any more.”

This article appeared in print under the headline “Facebook’s biggest secret”

Source: http://www.newscientist.com/article/mg22329804.200-the-secret-system-controlling-your-facebook-news-feed.html


USEMP presents its architecture at the CAiSE 2014 industrial panel

VELTI presented the USEMP approach to the value of personal data at the 26th International Conference on Advanced Information Systems Engineering (CAiSE 2014), held in Thessaloniki, Greece, June 16-20, 2014. The presentation focused on the USEMP use cases and the USEMP approach to evaluating personal data value, and introduced the USEMP LIO platform architecture to the community. The slides are available here. Related multimedia material is kindly hosted by the FP7 Social Sensor project here.