Criticism of Scholars’ Publication Culture and Media: Scopus, SINTA, Google Scholar

I’m very wide-eyed at the prospect of publishing my papers on these awesome research publication sites, so I wondered about the “cons”.

“Unreasonable” Self-Citation

In October 2017, President Jokowi was puzzled as to why only three Indonesian higher education institutions made it into the world’s top 500 universities according to Quacquarelli Symonds (QS). In November 2018, ethical breaches in publication came to light: one academic was found to have produced 69 research articles and 239 citations in a single year, and was suspected of having cited his own works in a “questionable” way.

(I’m not sure exactly why it’s questionable or unreasonable, since it wasn’t really elaborated in the source.)

Doubtful Ranking System

Quacquarelli Symonds (QS) is the global ranking agency that the Indonesian Research Ministry refers to when evaluating higher education performance. Apparently, there is no official document or statement explaining why our ministry chose QS rather than the Academic Ranking of World Universities (ARWU), Times Higher Education (THE), or Round University Rankings (RUR). QS ranks universities using a composite of six assessment indicators, a method that has apparently proven highly controversial and invited sharp criticism:

  1. Academic peer review
  2. Citations per faculty
  3. Lecturer and student ratio
  4. Reputation evaluation of employers
  5. Foreign student ratio
  6. Foreign teaching staff ratio

The glaring problem is that QS surveys university reputation by asking academics to name the 10 domestic and 30 international universities they consider most reputable in specific disciplines, without giving respondents any information about the institutions they are evaluating, which invites a halo effect. Unfortunately, non-response is also high, with the response rate barely reaching 50%.
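Just to make the arithmetic concrete: rankings like QS normalize each indicator to a common scale and fold them into a single weighted composite score. The sketch below is my own illustration with made-up weights, not QS’s published methodology.

```python
# Illustrative sketch of a weighted composite ranking score.
# The indicator names mirror the list above; the weights are
# invented placeholders, NOT QS's actual methodology.

ILLUSTRATIVE_WEIGHTS = {
    "academic_peer_review": 0.40,
    "citations_per_faculty": 0.20,
    "faculty_student_ratio": 0.20,
    "employer_reputation": 0.10,
    "international_students": 0.05,
    "international_faculty": 0.05,
}

def composite_score(indicator_scores: dict[str, float]) -> float:
    """Combine indicator scores (each normalized to 0-100) into one number."""
    return sum(ILLUSTRATIVE_WEIGHTS[name] * indicator_scores[name]
               for name in ILLUSTRATIVE_WEIGHTS)

# Example: a hypothetical university's normalized indicator scores.
print(composite_score({
    "academic_peer_review": 70.0,
    "citations_per_faculty": 55.0,
    "faculty_student_ratio": 80.0,
    "employer_reputation": 60.0,
    "international_students": 30.0,
    "international_faculty": 25.0,
}))  # -> 63.75
```

Because the reputation surveys feed the most heavily weighted indicators, any halo effect or low response rate gets amplified in the final score.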

Obsession with Scopus

Scopus is Elsevier’s commercial bibliographic database, and QS takes its citation data from Scopus. Universities compete to add indexed documents and citations, and apparently this is why academics are “obsessed” with it.

(Is it bad that universities are encouraging academics to publish? Is it bad that teachers are encouraging students to reach the top? Is it bad that the world is encouraging youth, and everyone else, to be leaders and productive people?)

SINTA

The Research Ministry created the Science and Technology Index (SINTA), which counts the documents and citations recorded in Google Scholar and Scopus but weighs the latter’s data more heavily when ranking individual researchers. This bibliometric system allegedly doesn’t improve the quality of researchers, but makes them obsess over job promotion and sacrifice their own integrity. No wonder the German Research Foundation (DFG) has forbidden bibliometric-based evaluation of researchers since 2013.
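SINTA’s actual formula isn’t spelled out in the source, so the following is only a hypothetical sketch of the general idea: add up documents and citations from both databases, with Scopus counted more heavily. The weights and function name are invented for illustration.

```python
# Hypothetical sketch of a SINTA-like score: counts from both sources,
# with Scopus weighted more heavily. The weights and names are invented
# for illustration; SINTA's real formula is not given in the source.

def sinta_like_score(scopus_docs: int, scopus_cites: int,
                     gs_docs: int, gs_cites: int,
                     scopus_weight: float = 3.0, gs_weight: float = 1.0) -> float:
    """Weighted sum of document and citation counts, favouring Scopus data."""
    scopus_part = scopus_weight * (scopus_docs + scopus_cites)
    gs_part = gs_weight * (gs_docs + gs_cites)
    return scopus_part + gs_part

# The researcher from the self-citation case: 69 articles and 239 citations
# in a year (treated here as all Scopus-indexed purely for the example).
print(sinta_like_score(scopus_docs=69, scopus_cites=239, gs_docs=0, gs_cites=0))
# -> 924.0
```

A score built purely from counts like this rewards volume and citations regardless of where they come from, which is exactly the incentive problem critics point to.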

There might be dirty data in Google Scholar profiles

Google is a private company, not a public service, and like any corporation its goal is to generate traffic 🙂 So Google computes influence and progress, and identifies small circles of mutual citation (whether for good scientific reasons or for artificial, reciprocal boosting of h-indices).

Google Scholar may also be missing citations to your work. If you have a common name (or one that includes diacritics, ligatures, or apostrophes), you’ll likely end up with other people’s publications in your Profile, which you are unfortunately responsible for identifying and removing. As for the citations of your publications: Google Scholar claims to pull citations from anywhere on the scholarly web into your Profile, but its definition of “the scholarly web” is less rigorous than many people realize. For example, Impactstory’s co-founder, Heather, has citations on her Google Scholar Profile from a FriendFeed post, and others have found Google Scholar citations to their work in student handbooks and LibGuides. So the citation data in Google Scholar isn’t 100% trustworthy.

Google Scholar citations are also, like any metric, susceptible to gaming. But whereas organizations like PLOS and Thomson Reuters (with its Journal Citation Reports) will flag and ban those found to be gaming the system, Google Scholar does not respond quickly (if at all) to reports of gaming. And as researchers point out, Google’s lack of transparency about how its data is collected makes gaming all the more difficult to discover.

The service also misses citations in a treasure-trove of scholarly material that’s stored in institutional repositories. Why? Because Google Scholar won’t harvest information from repositories in the format that repositories across the world tend to use (Dublin Core).
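To make the Dublin Core point concrete: institutional repositories typically expose their metadata through the OAI-PMH protocol in the oai_dc (Dublin Core) format, which any harvester can read. Here is a minimal sketch, assuming a hypothetical repository endpoint:

```python
# Minimal sketch of harvesting Dublin Core records over OAI-PMH, the
# protocol and metadata format most institutional repositories expose.
# The repository URL below is a hypothetical placeholder.
import urllib.request
import xml.etree.ElementTree as ET

OAI_ENDPOINT = "https://repository.example.edu/oai"  # placeholder endpoint
DC = "{http://purl.org/dc/elements/1.1/}"            # Dublin Core namespace

url = OAI_ENDPOINT + "?verb=ListRecords&metadataPrefix=oai_dc"
with urllib.request.urlopen(url) as response:
    tree = ET.parse(response)

# Each record carries plain Dublin Core fields: title, creator, date, ...
for title in tree.iter(DC + "title"):
    print(title.text)
```

The metadata is sitting there in a standard, machine-readable form; Google Scholar simply chooses not to harvest it this way.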

Google Scholar profiles aren’t permanent

It’s like monoculture in agriculture: farmers identify the most productive variety of a crop, the one that is easiest to grow and yields the best harvest year after year, and then grow that crop exclusively. Academia’s near-singular dependence on Google Scholar Profile data could be harmful to many if Google Scholar were ever to be shelved.

Google Scholar profiles won’t improve (apparently)

Google actively prevents anyone from improving the service. It’s been pointed out before that the lack of a Google Scholar API means that no one can add value to or improve the tool. That means services like Impactstory cannot include citations from Google Scholar, nor can they build upon Google Scholar Profiles to find and display metrics beyond citations or to automatically push new publications to Profiles. Based on the number of Google Scholar-related help tickets Impactstory says it receives, this lack of interoperability is a major pain point for researchers.

Google Scholar profiles only measure a narrow kind of scholarly impact

Google Scholar Profiles aren’t designed to meet the needs of web-native scholarship. These days, researchers are putting their software, data, posters, and other scholarly products online alongside their papers. Yet Google Scholar Profiles don’t allow them to track citations–nor any other type of impact indicator, including altmetrics–to those outputs.

Google Scholar Profiles also promote a much-maligned One Metric to Rule Them All: the h-index. Impactstory has already written about the many reasons why scholars should stop caring about the h-index; most of those reasons stem from the fact that h-indices, like Google Scholar Profiles, aren’t designed with web-native scholarship in mind.
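For anyone unfamiliar with it: an author’s h-index is the largest number h such that h of their papers have at least h citations each. A minimal sketch of the computation:

```python
# Compute the h-index: the largest h such that the author has
# h papers with at least h citations each.

def h_index(citation_counts: list[int]) -> int:
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers with these citation counts -> h-index of 3
print(h_index([10, 8, 5, 2, 1]))  # 3
```

Notice that the computation ignores everything except per-paper citation counts, which is exactly why it can’t capture software, data, posters, or any other web-native output.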

Sources:

“4 reasons why Google Scholar isn’t as great as you think it is” (Impactstory blog)

https://theconversation.com/efek-kobra-dosen-indonesia-terobsesi-pada-indeks-scopus-dan-praktik-tercela-menuju-universitas-kelas-dunia-105808

 

 
