
The Black Market for Facebook “Likes,” and What It Means for Citations and Alt-Metrics


This is a follow-up to my earlier 'publish or perish' post. It was inspired by Phil Davis, who posted an interesting article in the Scholarly Kitchen today that amplifies some of the concerns expressed in my initial post. Here are the opening lines of his article, which can be found in full at: http://scholarlykitchen.sspnet.org/2012/05/18/the-black-market-for-facebook-likes/

“There is an online market for so many intangible goods these days that it should come as no surprise that there is a market for Facebook “Likes” — the little thumbs-up rating that accompanies so many products and services we see on a daily basis.

For $75, a marketing company will sell you 1,000 Facebook “Likes,” according to NPR’s Planet Money. Only the marketing company does not supply the “likes” but works as a broker between real individuals who are willing to sell their online preferences to your product for very small sums of money — ten cents a “like” — and those who wish to artificially inflate their prestige.

Ten cents may not seem like a lot of money, but there is a huge workforce of individuals willing to be employed to undertake low-skilled, repetitive online work for pennies a task, as evidenced by mature markets like Amazon’s Mechanical Turk. Global outsourcing has never been easier.”

And later on Phil writes: “The artificial trust market is not new and is found in environments where online trust is important, such as purchasing an antique love seat from a complete stranger on eBay, finding a reputable bed and breakfast in rural Ireland, selecting a new e-book from Amazon, or choosing an app from the Apple Store. When in doubt, our tendency is to turn to the wisdom of the crowds because we believe that these ratings are accurate evaluations generated by honest individuals and based on real experiences.

Trust — or at least consensus — works the same way in scientific publication through the accumulation of citations, only the barriers to participate in this market are much, much higher. To cast your votes, you need to publish a paper that is indexed by Thomson Reuters’ Web of Science (or alternatively, Elsevier’s Scopus). Like Facebook, Thomson Reuters does not take kindly with citation manipulation and will delist a journal when it exhibits forms of citation manipulation such as systemic self-citation or, more recently, through the formation of citation cartels.”

He then refers to my earlier post, where I suggest a website like PleaseCiteMe.com where 'academics' can 'purchase' citations to increase their h-factor; see my original post below.

“What’s your h-factor?” is a question that is increasingly asked at gatherings of scientists or during interviews for academic positions. Scientific careers depend on h-factors these days. What am I talking about?

The h-index is an index that attempts to measure both the productivity and impact of the published work of a scientist or scholar. The index is based on the set of the scientist’s most cited papers and the number of citations that they have received in other publications. The index can also be applied to the productivity and impact of a group of scientists, such as a department or university or country.

A scientist has index h if h of his/her Np papers have at least h citations each, and the other (Np − h) papers have no more than h citations each.

In other words, a scholar with an index of h has published h papers each of which has been cited in other papers at least h times. Thus, the h-index reflects both the number of publications and the number of citations per publication. (source: wikipedia)
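For the computationally inclined, the definition translates into a few lines of Python. This is just a minimal sketch of the calculation (the function name is mine, not from any bibliometrics package): sort the citation counts in descending order and find the largest rank h at which the paper at that rank still has at least h citations.

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # paper at this rank still has enough citations
            h = rank
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3]))  # 3: three papers with at least 3 citations each
```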

An important determinant of the height of one's h-factor is what counts as a 'citeable publication'. In the Web of Science h-factor, only scientific articles published in journals with ISI recognition (determined by Thomson Reuters) are considered citeable (so not articles in other journals, chapters in books, etc.). The Scopus h-factor includes a larger pool of journals, while in Google Citations and in 'Publish or Perish' (www.harzing.com/pop.htm) the h-factor is likely to be higher still, as these also count articles in a wide range of journals, book chapters and reports as citeable. In my university, Wageningen University, it's not your x-factor that matters but your h-factor. Our librarians have become information specialists with expertise in bibliometrics and scientometrics. Such expertise is pivotal in helping our academics, science groups and, indeed, our university (76th on the Times Higher Education Index…) climb the rankings. Biblio-what?

Bibliometrics and scientometrics are two closely related approaches to measuring scientific publications and science in general, respectively. In practice, much of the work that falls under this header involves various types of citation analysis, which looks at how scholars cite one another in publications (source: Eric Meijers via www.microsites.oii.ox.ac.uk/tidsr/kb/48/what-bibliometrics-and-scientometrics).

Below you will see a screenshot of my personal bibliometrics (click the image to enlarge).

As you can see, my overall h-factor is 16. Impressive? Hardly. But how can I raise it? Let me turn to the crucial information Google Citations provides (if you want to check your own bibliometric data, you need to create a profile on Google Citations!). You will note below that "Learning in a changing world and changing in a learning world: reflexively fumbling towards sustainability" is the crucial paper at this moment (click the image to enlarge).

If I want to increase my h-factor to 17, I need 17 papers with at least 17 citations each. Assuming my top 15 papers already clear that bar, that means getting two more of my papers cited 17 times or more. I could try to promote the paper that currently occupies place 16 ("reflexively fumbling"). I would then also need another paper that is still somewhat attractive to cite – perhaps the 2006 paper with Justin Dillon on "The danger of blurring methods…".
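To make the arithmetic concrete, here is a small sketch that reuses the h_index function from the block above; the citation counts are invented purely for illustration (15 papers comfortably above the bar, one at 16 and one at 12):

```python
def citations_needed_for_next_h(citations):
    """Extra citations each top paper needs to lift the h-index by one.
    Reuses h_index() from the sketch above."""
    target = h_index(citations) + 1
    ranked = sorted(citations, reverse=True)
    # The top `target` papers must each reach `target` citations.
    return [max(0, target - c) for c in ranked[:target]]

# Invented counts: 15 papers well above the bar, one at 16, one at 12.
my_citations = [20] * 15 + [16, 12]
print(h_index(my_citations))                     # 16
print(citations_needed_for_next_h(my_citations)) # [0, ..., 0, 1, 5]
# Paper 16 needs one more citation; paper 17 needs five.
```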

So how can I do that? There are many ways, of course: I can suggest the papers to authors whose manuscripts I review for journals… or I can ask my colleagues to cite them… or I can make free downloads of them available via my blog… but there might be a better way – one that could be the beginning of the end of this metrics-based system: hBay.

Introducing: PleaseCiteMe.com

Why not develop PleaseCiteMe.com, a web-based system where people can trade citations? Scholars post papers of their own that they need cited in order to increase their h-factor. Of course there is a price to pay: the scholar has to cite the work of the scholar at the other end who agreed to cite theirs. Where that is not possible, a monetary value can be attached to a citation; a citation in a Web of Science journal might cost more than one in a Scopus journal or in a book chapter recognized by Google Citations. Of course we would need a few clever web designers, economists and Mark Zuckerberg types to build all this, but by 2014 we could probably take it to Wall Street, as by then there will be huge demand in the world of science for this service.
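For fun, a sketch of what a listing on such a site might look like as code. Everything here – the class, the tiers, the prices – is invented for the sake of the joke, not a real market:

```python
from dataclasses import dataclass

# Invented price list: a citation registered by a more selective index
# costs more. All figures are part of the joke.
PRICE_BY_INDEX = {"web_of_science": 10.0, "scopus": 5.0, "google_citations": 1.0}

@dataclass
class CitationOffer:
    paper_title: str   # the paper the seller wants cited
    seller: str
    index: str         # which database must register the citation
    reciprocal: bool   # True: "paid" by citing the buyer's work back

    def price(self) -> float:
        # A reciprocal trade costs no money; otherwise the price
        # depends on how selective the target index is.
        return 0.0 if self.reciprocal else PRICE_BY_INDEX[self.index]

offer = CitationOffer("Reflexively fumbling towards sustainability",
                      "some scholar", "web_of_science", reciprocal=False)
print(offer.price())  # 10.0
```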

Publish or perish… or, perhaps, publish and perish…

So what are we to do? In the end it is not about science for impact factors or h-factors but science for impact in society. Some of us are part of a system run increasingly on bibliometric information. Playing the game may be inevitable until the game ends: when people begin to realize that people no longer read but only cite… or when people only write without having time to read… or when strategic thinking replaces intelligent thinking, curiosity and a passion for contributing something meaningful to people and planet.

Fortunately there are still academic environments driven by societal relevance, planetary responsibility and curiosity. And, indeed, there are calls for bridging science and society and for other forms of reviewing quality in research (see for instance Funtowicz and Ravetz's idea of 'extended peer review'). More on this in another post!


