
Friday, July 1, 2022

Academic research means little if it doesn't influence govt policies

 


Today I focus on quite a dry topic - academic social networks, altmetrics, and scientometrics in the world of scholarly research and writing. However, it should be of interest to scholars, career academics, and the larger public.

Also, those among us who understand the subtle differences between "popularity" rankings, "research impact" (impact factor metrics of scholarly journals) and "relevance of research to society" would appreciate this column.

Malaysian universities are pedantic about altmetrics. Altmetrics, a portmanteau of "alternative metrics", uses online interactions to measure the impact of research and scholarly publications.

For example, our public university promotions rely heavily on these figures. Often, both academic and public perceptions of the quality of our scholars are conditioned by these metrics.

Furthermore, there is often a direct correlation between the “intellectual respect” that academic peers have for one another, and such data. This correlation contributes, in part, to the toxic culture that prevails in the university environment.

Lecturers often suffer sleepless nights and high stress because they spend entire careers trying to raise these numbers, not to mention the time spent constantly googling colleagues whose numbers may have overtaken theirs. Altmetric competition is rife in Malaysian academia!

What are these numbers all about?

There are a few platforms that gauge scholarly output. ResearchGate (RG) is a European commercial social networking site for "scientists" and researchers to share articles, ask and answer questions, engage in critiques, and find academic collaborators.

RG’s website does not mention much about social scientists or social science research, and prioritises scientists and their scientific research. It is supposedly the largest academic social network in terms of active users, but not necessarily registered users.

Google Scholar (GS) probably has the largest number of registered users. It uses the h-index metric to assess output, as do three other platforms: the Web of Science (WoS), Scopus, and the Arts and Humanities Citation Index. The h-index, proposed by the American physicist Jorge Hirsch in 2005, is supposed to reflect the productivity of authors based on their publication and citation records.
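The h-index itself is simple to compute from a list of citation counts: an author has index h if h of their papers have each been cited at least h times. A minimal sketch in Python (the function name and the example counts are illustrative, not taken from any platform's actual code):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    h papers with at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# Five papers cited 3, 0, 6, 1 and 5 times give an h-index of 3:
# three papers (cited 6, 5 and 3 times) each have at least 3 citations.
print(h_index([3, 0, 6, 1, 5]))  # prints 3
```

Note how crude the measure is: it says nothing about why a paper is cited, or whether the work mattered outside the citing community, which is precisely the gap this column goes on to discuss.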

Explaining productivity and citations

Citations indicate how many readers pay attention to what an author writes. For example, Noam Chomsky of MIT is one of the most cited individuals globally. Between 1980 and 1992, he received 3,874 citations in the Arts and Humanities Citation Index.

That made him the most cited living person over that 12-year period, and the eighth most cited source overall. The neurologist Sigmund Freud is in top position, and the philosopher Georg Hegel is in third place. All three are white, Western and male, but I shall reserve a discussion of this bias for a future column.

On their opening web page, the GS announcement reads: “Google Scholar provides a simple way to broadly search for scholarly literature. From one place, you can search across many disciplines and sources: articles, theses, books, abstracts and court opinions, from academic publishers, professional societies, online repositories, universities and other websites. Google Scholar helps you find relevant work across the world of scholarly research”.

A cursory read of these websites reveals that they collect citations and "reads" through automated web-crawling algorithms that extract bibliographic data. In March, RG announced that they would discontinue the RG Score, which claimed to assess the quality of scholarly output (publications). Part of their announcement reads: "... together with community feedback, we made the decision to remove the RG Score.

“We will soon update your profile to showcase a set of metrics to allow a more holistic assessment of your research impact. This will include Research Interest, h-index, citations, reads, and recommendations forming the core set of metrics that we use to represent your impact to you and others on the platform."

What is RG's alternative? They had this to say: "For those who want the benefit of having a metric to quickly assess their own and others’ profiles to understand their contributions to science, we would encourage the adoption of Research Interest, a metric that captures the interest in a researcher’s work within the scientific community.

“We believe that this holistic approach, backed up by our new metrics criteria, provides a better way forward for everyone on ResearchGate”.

Several key phrases recur across these websites: "contributions to science", "a metric that captures the interest in a researcher's work within the scientific community", "indication of a member's impact", "a holistic approach", and "capturing how many are paying attention to what these authors write".

In the Malaysian context, there is a mammoth gap between a Malaysian scholar’s altmetrics and the relevance of their research and publications in society. What’s worse is that there is hardly any discussion about this glaring gap.

While a high h-index for brilliant scholars such as Chomsky or India's CNR Rao is worth acknowledging, using it to gauge Malaysian scholars is meaningless. Not all those with soaring numbers have engaged in relevant research that could address the crises snowballing in our society.

Social problems are worsening, and sound economic planning remains short-sighted. Selfish and clueless politicians push through ill-considered, ad hoc policies with ethnocentric and religious undertones. All of this continues to divide the nation. To make matters worse, our universities have not stepped up to save the day.

Where are the social scientists whose research and published papers provide solutions to these problems? At this stage of our national crisis, their RG and h-index scores are grossly irrelevant.

When will our university administrators and ministers acknowledge that these metrics, including university rankings, have failed to capture research and publication quality?

Ideally, scholarly research, and what academics write, speak about, and teach must always be for the benefit of the common people. In the Malaysian context, this is rarely the case, especially in the social sciences where it is needed the most. - Mkini


SHARIFAH MUNIRAH ALATAS is an academician with zero tolerance for corrupt, arrogant and frivolous leadership.

The views expressed here are those of the author/contributor and do not necessarily represent the views of MMKtT.
