
Rank stupidity?

Oct 20, 2014 - Last updated at Oct 20, 2014

“Cite your sources.”

The admonition rings in the ear of every slapdash undergraduate and corner-cutting postdoc. But have we taken the emphasis on citation too far?

From an early age, we are taught to acknowledge those whose ideas and insights have shaped our thinking. During our academic careers, we learn to provide correct attribution for the existing words, data or images that we are using.

And “credit where credit is due” is the axial principle around which the entire scholarly publication system revolves.

In academia, citing the work of others is not a mere courtesy; it is a normative requirement.

Indeed, it is almost impossible to imagine an article being published in a reputable journal without an accompanying list of references. Plagiarism is one of the few acts that can bring a glittering academic career to a halt.

Citations contextualise research and help orient the reader. They allow the reader to evaluate the author’s work on the basis of the perceived quality of the selected references. And they enable the reader to track down previously unknown but potentially useful work.

Eugene Garfield understood this.

In 1955, Garfield proposed the Science Citation Index (SCI), a database containing all the cited references across the most highly respected scientific journals, thereby capturing the sprawling web of connections among texts.

As he put it, “by using the authors’ references in compiling the citation index, we are in reality utilising an army of indexers, for every time an author makes a reference, he is in effect indexing that work from his point of view”.

The SCI would enable researchers to follow chains of knowledge backward and forward along the citation links embedded in scientific literature.

The SCI’s potential was not lost on the scientific community, whose members quickly adopted it — but not for the reasons one might expect.

The enrichment of the subject matter with previous analyses, connections and conclusions regarding the same cited texts was certainly part of the attraction.

More appealing, however, was the possibility of tracking the scholarly influence of oneself and others over time and across fields, and identifying the most highly cited scientists, papers, journals and institutions.

Almost overnight, the humble bibliographic reference acquired symbolic significance, and science gained a scorecard.

But did science really need one?

The SCI gave rise to multiple citation-based measures, with two, in particular, warranting mention.

The first — another brainchild of Garfield’s — is the impact factor (IF), which offers a putative indication of an academic journal’s quality, based on the average number of citations received in a given year by the articles it published during the previous two years.

A high impact factor instantly boosts a journal’s prestige.
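For readers who want to see the arithmetic, here is a minimal sketch of that calculation, using the standard two-year citation window; the figures are hypothetical and refer to no actual journal.

```python
# Hypothetical example: a journal's 2014 impact factor.
citations_in_2014_to_2012_2013_articles = 1200   # citations received this year to the two-year window
citable_items_published_2012_2013 = 400          # articles and reviews published in that window

impact_factor_2014 = citations_in_2014_to_2012_2013_articles / citable_items_published_2012_2013
print(impact_factor_2014)  # -> 3.0, i.e. three citations per article on average
```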

Another notable measure is the h-index — conceived by the physicist Jorge Hirsch — which aims to measure scholars’ productivity and impact.

The h-index calculation is straightforward: if a researcher publishes 20 papers that have each been cited at least 20 times, she has an h-index of 20. If she publishes 34 papers, each cited at least 34 times, her h-index rises to 34.
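A minimal sketch of that accounting in code may make it concrete; the function name and the sample citation counts below are purely illustrative.

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    # Sort citation counts from highest to lowest.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank   # this paper still clears the threshold
        else:
            break      # every later paper has fewer citations than its rank
    return h

# A researcher with 20 papers, each cited at least 20 times, scores h = 20.
print(h_index([25, 24, 23, 22, 20] + [20] * 15))  # -> 20
```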

Despite the well-documented limitations of these measures, their simplicity — not to mention the competition inherent to scientific progress — makes them widely appealing.

As a result, they have been deemed meaningful by people who should know better.

Indeed, citation data have become the vital statistics of academia, with researchers routinely including IF data and h-indexes — along with raw citation scores generated from sources like Thomson Reuters’ Web of Science (Garfield’s database), Elsevier’s Scopus, and Google Scholar — on their curricula vitae. 

Likewise, several annual university rankings — including the CWTS Leiden Ranking, the Shanghai Academic Ranking of World Universities, QS World University Rankings and the World University Rankings — rely on publication and citation data in their calculations.

University presidents feel compelled to boost their institutions’ citation records, even though they know that the validity and reliability of these data, and of the rankings they inform, are questionable.

The problem is not limited to academia.

Administrators are using such measures to assess the productivity of those they hire and fund, and to track the downstream impact of the research and development projects they underwrite, with little regard for the limitations of such indices.

In countries like the United Kingdom, Australia, Germany and Italy, research-assessment exercises are inexorably creating a culture of quantification and accountability in which citation data plays an increasingly important role.

The more these “objective” indicators are used in research assessment and personnel evaluation, the more scientists feel obliged to play the citation game.

Increasingly, that means gaming the system, by focusing on work that promises short-term yields, pursuing “hot” research topics, spending more time on self-promotion (facilitated by the proliferation of social media), and slicing and dicing their work to attract maximum attention.

The recent emergence of so-called “alternative metrics” (such as downloads, recommendations, Facebook likes, and Tweets) has intensified the pressure on researchers to stockpile multidimensional evidence of their influence.

To be sure, the application of social analytics to the world of research and scholarship may yet provide important insights that make it easier to assess a scholar’s “true” contributions.

The challenge will be to manage the trade-off between transparency and triviality.

As Einstein purportedly said: “Not everything that can be counted counts, and not everything that counts can be counted.” 

The writer is professor emeritus of information science at Indiana University Bloomington, and honorary professor at City University London. © Project Syndicate/Institute for Human Sciences, 2014. www.project-syndicate.org
