Social media turning back on itself

Two posts published in the last day or so highlight some of the challenges facing social media sites. The first is "The Spam Farms of the Social Web" by Niall Kennedy and the other is a post by Steve Rubel titled "Fake News Story Games Thousands of Digg Users". While the two posts highlight slightly different challenges, both are important because they shine a light on the flip side of the culture of sharing that pervades the so-called "Social Web".

Niall’s post reports back on his investigation into an odd result on the Digg home page. For those not familiar with Digg, it is a social news site where users like you and me submit posts and other items which may be of interest to the community, and everybody gets to vote on the submissions. The more popular submissions generally make it to the front page, and front-page placement is taken as a signal of a submission’s popularity and general newsworthiness. Having your site mentioned on the front page is a lot like being Slashdotted: your site will likely be inundated with visitors clicking through from the link in the Digg submission. This, of course, means loads of hits. If you happen to be running ads or some other form of affiliate program on your site, those hits can translate into more money, because more people are visiting your site and there is therefore a greater likelihood that some of them will click on those ads or banners and earn you some cash.

So Niall noticed something odd going on at Digg and decided to investigate:

Last weekend I noticed a Digg submission about weight loss tips had climbed the site’s front page, earning a coveted position in the top 5 technology stories of the moment. The 13 sure-fire tips were authored by "Dental Geek" and posted to the "Discount Dental Plan" category on his WordPress blog. Scanning the sidebar links and adjacent content it was obvious this content was out of place on a page optimized for dental insurance. The webmaster had inserted some Digg bait, seeded a few social bookmarking services, and waited for links and page views to roll in, creating a new node in a spam farm fueled by high-paying affiliate programs and identity collection for resale.

His investigation uncovered a scheme that exploits the benefits of social media (the huge number of links to and from sites and services like MySpace, Digg and others) to boost traffic to sites running particularly lucrative affiliate schemes and thereby artificially inflate the income those programs generate. Basically this is spam in a different form, and it exploits the very aspects of the Social Web that make it such a wonderful platform for social interaction.

Steve Rubel examined a similar issue from a different angle. He looked at a post on The Mu Life about a totally bogus article, attributed to Reuters and submitted to Digg, about a fictional recall of the Sony Playstation 3 because of an alleged fatal flaw in its graphics processor. The post was Dugg (voted on) over 900 times before it was eventually removed from Digg. The problem is that the story was not factually accurate, and yet people Dugg it because it seemed interesting or appealed to some bias they had for or against the Sony Playstation 3.

One of the ideas behind voting on stories submitted to Digg is to promote the more newsworthy stories to the front page. Digg is a social news site that relies on users’ views and opinions to determine the importance of news items submitted to the site. The problem with this, as Rubel points out, is the overemphasis on popularity rather than quality:

All of this points to a real problem in the social media world. The only yardsticks we use to measure the trustworthiness of a source are purely based on popularity – e.g. in-bound links, votes, etc. Now often popularity and quality are closely aligned. However, both of these incidents demonstrate that the current system isn’t working. We need more.

So what now? I imagine things carry on pretty much as they have been, although I would like to see some form of peer review kick in, a process of self-regulation in which users start to correct abuses of the systems designed to enable greater social interaction on the Web. There will always be some degree of abuse. It is really a question of how sophisticated the Web’s users become and how much of this abuse we are prepared to tolerate.

What do you think?
