
Boosting citations through engineered self-citation makes h-index figures go up, but the numbers are fake

The cobra effect: Indonesian lecturers' obsession with the Scopus index and the dubious practices behind the push for world-class universities

Aspiring to world-class status, yet citing their own research.
Tama2u/Shutterstock

Rizqy Amelia Zein, Airlangga University

President Joko Widodo has frequently criticized the Ministry of Research, Technology and Higher Education (Kemristekdikti) over the competitiveness of Indonesian universities, which he considers unsatisfactory.

Most recently, last October, Jokowi asked why only three Indonesian universities (perguruan tinggi, or PT) had made it into the 2018 top 500 world universities according to Quacquarelli Symonds (QS). The President even questioned the management of universities that were unable to respond to global demands.

In the midst of its much-promoted university internationalization policy, the Ministry of Research recently released findings on a number of publication ethics violations committed by researchers, journal editors, and state university (PTN) administrators.

The violations found included duplicate publications, inappropriate self-citation, and policies of publishing scientific papers without a disciplined review process.

The Research Ministry's Credit Score Assessment Team (PAK), for example, found one researcher who produced 69 scientific papers and 239 citations in a single year, raising the suspicion that the researcher had been citing his own work inappropriately.

Lecturers commit these violations to "polish" their performance records so their institutions can go international.

Seeing this trend, I believe the Ministry of Research's university internationalization policies need to be re-evaluated.

The ranking system is problematic

The QS ranking is only one of several global ranking systems the Ministry of Research uses as a reference to evaluate university performance. To date, I have found no publicly accessible policy document explaining why the ministry chose QS, rather than the Academic Ranking of World Universities (ARWU), Times Higher Education (THE), or Round University Rankings (RUR), as its performance benchmark.

QS compiles its ranking using a fairly complex method built on six assessment indicators. The two indicators carrying the highest weight are (1) academic peer review and (2) citations per faculty member. The other indicators are (3) the faculty-to-student ratio, (4) employer reputation, (5) the proportion of international students, and (6) the proportion of international faculty, combined roughly as sketched below.
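To make the arithmetic concrete, here is a minimal sketch of how a QS-style composite score is assembled. The weights approximate QS's published methodology of that era and should be read as illustrative rather than authoritative; the per-indicator inputs are hypothetical values assumed to be pre-normalized to a 0-100 scale.

```python
# Illustrative sketch of a QS-style composite score. Weights approximate
# QS's published methodology of that era (illustrative, not authoritative);
# per-indicator inputs are hypothetical, pre-normalized to 0-100.

QS_WEIGHTS = {
    "academic_reputation":    0.40,  # (1) academic peer review
    "citations_per_faculty":  0.20,  # (2) Scopus-derived citations
    "faculty_student_ratio":  0.20,  # (3)
    "employer_reputation":    0.10,  # (4)
    "international_students": 0.05,  # (5)
    "international_faculty":  0.05,  # (6)
}

def composite_score(indicator_scores):
    # Weighted sum of normalized (0-100) indicator scores.
    return sum(w * indicator_scores[k] for k, w in QS_WEIGHTS.items())

# Hypothetical university: strong on reputation, weak on internationalization.
example = {
    "academic_reputation":    70.0,
    "citations_per_faculty":  45.0,
    "faculty_student_ratio":  60.0,
    "employer_reputation":    55.0,
    "international_students": 10.0,
    "international_faculty":   5.0,
}
print(f"composite: {composite_score(example):.1f}")  # composite: 55.2
```

Under these weights, the reputation survey and Scopus citations together drive 60% of the total, which helps explain why institutions concentrate their efforts there.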

Although the QS method is highly controversial and has drawn sharp criticism, the Ministry of Research and several universities use these indicators as the basis for their higher education development strategies.

Academic peer review is a problematic indicator to operationalize. To measure it, QS surveys university reputation by asking academics to name 10 domestic and 30 international universities they consider most reputable in their own disciplines. QS therefore ranks universities not only overall but also by specific discipline.

This method is highly questionable because of its non-response rate. Among academic respondents from Asia, Australia, and New Zealand, for example, the survey response rate was below 50%.

Moreover, respondents are given no information about the universities they are rating, so the results may be contaminated by the halo effect: impressions biased by the big names of certain institutions. Such a method may reveal little about the real quality of higher education and instead reinforces the suspicion that the measurement is tainted by cognitive bias.

Infatuated with Scopus

The oddest part of this ranking is the citation-count component. QS draws its citation data from Scopus, Elsevier's commercial bibliographic database. Universities compete to push their Scopus-indexed documents and citation counts as high as possible so that their score on this indicator is optimal. This is why Indonesian academics are "crazy" about Scopus.

Not stopping there, the Ministry of Research has also created its own research and publication performance measurement, the Science and Technology Index (SINTA) score. The SINTA score is calculated from the documents and citations recorded in Google Scholar and Scopus, but gives much greater weight to those recorded in Scopus, as the sketch below illustrates.
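The exact SINTA formula is not given in this article, so the following sketch is purely hypothetical: the weights are invented, and only the structure, with Scopus counts weighted far more heavily than Google Scholar counts, reflects the description above.

```python
# Purely hypothetical sketch of a SINTA-style score. The real SINTA formula
# is not given here; the weights below are invented solely to show the
# structure described in the text: Scopus-recorded documents and citations
# count for much more than Google Scholar ones.

SCOPUS_WEIGHT = 4.0  # hypothetical "much greater" weight for Scopus
GS_WEIGHT = 1.0      # hypothetical baseline weight for Google Scholar

def sinta_style_score(scopus_docs, scopus_cites, gs_docs, gs_cites):
    return (SCOPUS_WEIGHT * (scopus_docs + scopus_cites)
            + GS_WEIGHT * (gs_docs + gs_cites))

# Two researchers with identical output, recorded in different databases:
print(sinta_style_score(10, 50, 0, 0))  # 240.0 -- all output in Scopus
print(sinta_style_score(0, 0, 10, 50))  #  60.0 -- all output in Google Scholar
```

Identical output scores several times higher when it is indexed in Scopus; that asymmetry is precisely the incentive behind the Scopus obsession described above.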

Based on SINTA scores, the Ministry of Research ranks individual researchers, universities, and even scientific journals. In the long run, the ministry's dependence on bibliometrics such as SINTA scores and the h-index (an index combining productivity and citations) will actually cause the development of science and innovation to stagnate.
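For readers unfamiliar with the metric: a researcher's h-index is the largest number h such that h of their papers have each been cited at least h times. A minimal sketch of the computation:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    while h < len(cites) and cites[h] >= h + 1:
        h += 1
    return h

# Five papers cited [10, 8, 5, 4, 3] times -> h = 4:
# four papers have at least 4 citations, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Note that the index says nothing about why a paper was cited, which is exactly the loophole the practices below exploit.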

Such a system does not encourage researchers to improve the quality of their work; instead it breeds an excessive obsession with chasing promotions and incentives, to the point of sacrificing their own integrity.

It is no wonder the German Science Foundation has banned bibliometric-based evaluations of researcher performance since as early as 2013.

The cobra effect and unnatural citations

The cobra effect anecdote dates from the period of British colonial rule in India. Worried about the growing population of wild cobras in Delhi, the colonial government offered a bounty to anyone who hunted down a cobra. The strategy seemed to work at first, as people flocked to turn in dead cobras for payment.

However, the colonial government soon discovered that many residents were cheating by deliberately breeding cobras to collect the bounty, and the scheme was finally stopped. Before long, the wild cobra population rose dramatically, because no one wanted to hunt cobras anymore.

In the context of scientific publishing, the cobra effect resembles the self-citation practices researchers deliberately use to hoist their h-index. This makes the h-index misleading as a signal of a researcher's reputation, since citations should occur naturally.
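An entirely hypothetical illustration of the distortion: padding every paper with a few engineered self-citations can lift the h-index even though the underlying influence of the work has not changed. All citation counts below are invented.

```python
# Hypothetical demonstration: a few engineered self-citations per paper
# can raise the h-index. All numbers are invented.

def h_index(citations):
    # Largest h such that h papers have at least h citations each.
    cites = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(cites) if c >= i + 1)

honest = [6, 5, 4, 2, 2, 1]           # organic citations: h-index = 3
engineered = [c + 3 for c in honest]  # add 3 self-citations per paper

print(h_index(honest))      # 3
print(h_index(engineered))  # 5 -- counts become [9, 8, 7, 5, 5, 4]
```

Eighteen strategically placed self-citations lift the index from 3 to 5, with no change in the actual influence of the work.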

The internationalization policy leaves ample room for abuse of power. Journal editors are tempted to require authors to cite manuscripts previously published in their own journals. Lecturers are tempted to pressure students to list them as co-authors of scientific articles or to cite their writings, so that their h-index rises.

Although such practices are prohibited by the Committee on Publication Ethics (COPE), the world's largest association of editors and publishers of scholarly journals, this abuse of power is in fact still commonly found in universities.

This problem is not unique to Indonesia. Robert J. Sternberg, a professor of developmental psychology at Cornell University, was recently called out for much the same thing. Astonishingly, one of his articles cites 351 references, 161 of which are his own works. Several of Sternberg's papers were retracted after he was caught recycling his own previously published work.

As a result, Sternberg was forced to step down as editor-in-chief of Perspectives on Psychological Science, a well-known journal published by the Association for Psychological Science (APS). The embarrassing episode cost him face, even though Sternberg was the main figure behind such popular theories as the triangular theory of love and the triarchic theory of intelligence.

The Ministry of Research's findings on self-citation are an early symptom of the cobra effect created by its own policies. Efforts to boost publication counts that rely too heavily on bibliometric measures and on incentives, whether promotion or extra income, open the door to the overjustification effect.

In social psychology, the overjustification effect occurs when individuals are driven to perform a task by extrinsic rewards. They respond with highly efficient strategies, but the resulting quality tends to be poor. To keep individuals focused on the task, the incentives must always be available; when the incentives disappear, so does their interest.

The Ministry of Research should not be too surprised, then, if universities and researchers do everything they can to look good. Focusing too much on metrics makes us obsess over cosmetics while forgetting more substantive matters: academic integrity and independence, for example.

Open science as a solution

I believe the credibility and informational value of research lie in the integrity of the researcher, and integrity shows in a researcher's commitment to conducting research transparently and openly.

For this reason, I propose a researcher performance appraisal system based on open science practices, which require researchers to make all their research materials and findings publicly available. The Center for Open Science (COS), a non-profit organization that advocates open science, publishes the Transparency and Openness Promotion (TOP) Guidelines for researchers, universities, and journal editors who wish to adopt open science principles.

Under such a system, researchers who openly report potential conflicts of interest, research ethics review outcomes, initial assumptions including hypotheses, data collection plans, sample size planning and stopping rules, participant recruitment strategies, raw data, and data analysis procedures are rewarded more highly than researchers who decline to do so.

Open science-based policies have been trending since the replication crisis came under wide discussion in 2011. The European Union has gone further, developing Plan S as a reference for evaluating the research it sponsors.

Several high-impact-factor journals also award open science badges to researchers who preregister their studies and share raw data and analysis procedures with the public.

I believe that for Indonesia's research ecosystem to become progressive and dynamic, researchers and university administrators must adopt the principle of transparency. At this point, there are only two choices: open science or being left behind entirely.

Rizqy Amelia Zein, Assistant Lecturer in Social and Personality Psychology, Airlangga University

This article first appeared on The Conversation. Read the source article.
