
About this Book

This book constitutes the refereed proceedings of the International Workshop on Altmetrics for Research Outputs Measurements and Scholarly Information Management, AROSIM 2018, held in Singapore, in January 2018.

The 7 revised full papers presented here, together with two keynote papers and one introduction paper, were carefully reviewed and selected from 20 submissions. The workshop investigated how social media-based metrics, along with traditional and non-traditional metrics, can advance the state of the art in measuring research outputs.

Table of Contents

Frontmatter

Introduction

Frontmatter

Introduction to the Workshop on Altmetrics for Research Outputs Measurement and Scholarly Information Management (AROSIM 2018)

Abstract
“Altmetrics” refers to novel metrics, often based on social media, for measuring the impact of diverse scholarly objects such as research papers, source code, or datasets. Altmetrics complement traditional metrics, such as bibliometrics, by offering novel data points to give a more holistic understanding of the outreach of research outputs. This increased awareness could help researchers and organizations to better gauge their performance and understand the outreach and impact of their research. Altmetrics is related to diverse research areas such as informetrics, social network analysis, scholarly communication, digital libraries, and information retrieval. This workshop on Altmetrics for Research Outputs Measurement and Scholarly Information Management (AROSIM 2018) aims to give a multi-disciplinary perspective on this new and cutting-edge research area by addressing a range of opportunities, emerging tools and methods, along with other discussion topics in the fields of altmetrics, traditional metrics, and related areas.
Mojisola Erdt, Aravind Sesagiri Raamkumar, Edie Rasmussen, Yin-Leng Theng

Keynote Papers

Frontmatter

Using Altmetrics to Support Research Evaluation

Abstract
Altmetrics are indicators that have been proposed as alternatives to citation counts for academic publication impact assessment. Altmetrics may be valued for their speed or their ability to reflect the non-scholarly or societal impacts of research. Evidence supports these claims for some altmetrics, but many are limited in coverage (the proportion of outputs that have non-zero values) or in their ability to reflect societal impact. This article describes data sources for altmetrics, indicator formulae, and strategies for applying them to different tasks. It encompasses traditional altmetrics as well as webometric and usage indicators.
Mike Thelwall
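As a hedged illustration of the kind of indicator formula the keynote surveys (a generic sketch, not necessarily the formulae used in the paper), a field-normalized altmetric score can be written as the mean of log-transformed counts divided by the world average for the same field and year:

    \mathrm{MNLS} = \frac{1}{n} \sum_{i=1}^{n} \frac{\ln(1 + a_i)}{\mu_{f(i),\,y(i)}}, \qquad \mu_{f,y} = \text{mean of } \ln(1 + a) \text{ over all world outputs in field } f \text{ and year } y

where a_i is the altmetric count (e.g., tweets or Mendeley readers) of output i; values above 1 indicate above-average attention for the group of n outputs being evaluated.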

Towards Greater Context for Altmetrics

Abstract
This paper argues that scientometricians need to borrow from qualitative analytical approaches to improve the field of altmetrics research. Existing approaches to contextualizing metrics using quantitative means, by researchers and altmetrics data providers alike, are discussed and critiqued. I propose research grounded in ways of thinking from the humanities and social sciences into human motivations for sharing research online, so that we can truly understand what altmetrics mean and how they should be interpreted. I explore several ways that the research community and data providers alike can help those who use altmetrics in evaluation move from “data” to “wisdom”.
Stacy Konkiel

Full Papers

Frontmatter

Monitoring the Broader Impact of the Journal Publication Output on Country Level: A Case Study for Austria

Abstract
This study provides an example of a monitoring practice concerning the broader impact of the journal publication output at the country level. All Austrian publications of the last three publication years indexed in the WoS Core Collection and mapped to a DOI were analysed in PlumX. The metrics traced in the different data sources were compared for six main knowledge areas. The results reinforce the importance of usage metrics, especially in disciplines related to the Arts & Humanities. The highest data coverage is provided by the number of readers in Mendeley. The percentage of publications with social media scores, especially tweets, has increased significantly within the last three years, in line with the growing popularity of these tools. The highest values for social media are reported for the Health and Life Sciences, followed very closely by the Social Sciences; the relative insignificance in the Arts & Humanities is noteworthy. Our study confirms very low correlation values between the different measures traced in PlumX and supports the hypothesis that these metrics should be considered complementary sources. High correlations between the same measures or metrics originating from different data sources were only reported for citations, not for usage data. Medium correlation values were observed between usage and citation counts in the WoS Core Collection. No association of the number of co-authors or co-affiliations with any of the measures considered in this study could be found, except for a low correlation between the number of affiliations and captures or citations.
Juan Gorraiz, Benedikt Blahous, Martin Wieland
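A minimal sketch of the correlation analysis described above, assuming a hypothetical PlumX export with one row per DOI-mapped publication (file name and column names are assumptions, not taken from the paper):

    # Hedged sketch: pairwise Spearman correlations between metrics
    # from a hypothetical PlumX export (one row per publication).
    import pandas as pd

    df = pd.read_csv("plumx_export.csv")  # assumed file name
    metrics = ["mendeley_readers", "tweets", "usage_counts", "wos_citations"]  # assumed columns

    # Spearman is commonly preferred for skewed count data such as citations and readers.
    print(df[metrics].corr(method="spearman").round(2))

Low off-diagonal values would be consistent with the paper's conclusion that these metrics capture complementary signals.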

New Dialog, New Services with Altmetrics: Lingnan University Library Experience

Abstract
This paper chronicles Lingnan University Library's approach to integrating altmetrics into its institutional repository, triggering new dialog with Lingnan's researchers and re-energizing their attention on scholarly communication services from a new perspective.
Sheila Cheung, Cindy Kot, Kammy Chan

How Do Scholars Evaluate and Promote Research Outputs? An NTU Case Study

Abstract
Academic scholars have begun to use diverse social media tools to disseminate and evaluate research outputs. Altmetrics, which are indices based on social media, have emerged in recent years and have received attention from scholars in a variety of research areas. This case study aims to investigate how researchers at Nanyang Technological University (NTU) promote their research outputs and how they conduct research evaluation. We conducted semi-structured interviews with eighteen NTU researchers. Results from the study show that NTU researchers still prefer traditional metrics such as citation count, journal impact factor (JIF) and h-index for their research evaluation. They consider these metrics essential determinants of the quality of a research paper. In terms of altmetrics, most NTU researchers interviewed are aware of them, but have not yet used them in research evaluation. Furthermore, we found that the main methods used to promote research outputs are attending academic conferences and publishing papers in journals. However, NTU researchers are also beginning to use social networking sites to disseminate new findings. The results of this study contribute to the previous literature and help in understanding how social media and altmetrics are influencing the traditional methods of promoting and evaluating research outputs in academia.
Han Zheng, Mojisola Erdt, Yin-Leng Theng

Scientific vs. Public Attention: A Comparison of Top Cited Papers in WoS and Top Papers by Altmetric Score

Abstract
Alternative metrics or altmetrics are article-level metrics that have been used to quantify the attention given to scholarly papers in online fora or social media. In part, they were proposed to replace journal-level metrics such as the Impact Factor, and also to test whether altmetric scores could predict highly cited papers. Early studies, done soon after altmetrics were proposed in 2010, were somewhat premature, as the use of social media was not yet widely prevalent, and their results may no longer be relevant now that use levels have risen significantly. In 2016, Altmetric.com tracked over 17 million mentions of 2.7 million different research outputs. Of these, the top 100 most-discussed journal articles of 2016 were presented with details of the journals in which the research was published, affiliations of authors, fields, countries, etc. We attempt to obtain a bibliometric profile of these papers with high altmetric scores to identify the underlying patterns and characteristics that motivate high public attention. In parallel, we have analyzed the top 100 highly cited papers from the same year. Our objective in this empirical study is to examine the similarities and distinguishing features of scientific attention as measured by citations and public attention in online fora.
A significant finding is that there is very little overlap between very highly cited papers and those that received the highest altmetric scores.
Sumit Kumar Banshal, Aparna Basu, Vivek Kumar Singh, Pranab K. Muhuri
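The overlap result can be illustrated with a minimal sketch under assumed file names and columns (not taken from the paper): compare the DOIs of the top 100 papers by WoS citations with the top 100 papers by Altmetric score.

    # Hedged sketch: overlap between two top-100 lists, matched on DOI.
    import pandas as pd

    top_cited = set(pd.read_csv("top100_wos_cited.csv")["doi"].str.lower())   # assumed file/column
    top_social = set(pd.read_csv("top100_altmetric.csv")["doi"].str.lower())  # assumed file/column

    overlap = top_cited & top_social
    print(f"Papers appearing in both lists: {len(overlap)}")
    print(f"Jaccard similarity: {len(overlap) / len(top_cited | top_social):.3f}")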

Field-Weighting Readership: How Does It Compare to Field-Weighting Citations?

Abstract
Recent advances in computational power and the growth of the internet mean that we now have access to a wider array of data than ever before. If used appropriately, and in conjunction with peer evaluation and careful interpretation, metrics can inform and enhance research assessment through the benefits of being impartial, comparable, and scalable. There have been several calls for a “basket of metrics” to be incorporated into research evaluation. However, research is a multi-faceted and complex endeavor. Its outputs and outcomes vary, in particular by field, so measuring research impact can be challenging. In this paper, we reflect on the concept of field-weighting and discuss field-weighting methodologies. We study applications of field-weighting for Mendeley readership and present comparative analyses of field-weighted citation impact (FWCI) and field-weighted readership impact (FWRI). We see that there is a strong correlation between the number of papers cited and read per country. Overall, per subject area for the most prolific countries, FWCI and FWRI values tend to be close. Variations per country tend to hold true per field. FWRI appears to be a robust metric that can offer a useful complement to FWCI, in that it provides insights on a different part of the scholarly communications cycle.
Sarah Huggett, Chris James, Eleonora Palmaro
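For orientation, field-weighted indicators of this kind are ratios of observed to expected counts; a hedged sketch of the two metrics compared above (the authors' exact normalization may differ in detail) is:

    \mathrm{FWCI}_i = \frac{c_i}{\bar{c}_{f(i),\,y(i),\,t(i)}}, \qquad \mathrm{FWRI}_i = \frac{r_i}{\bar{r}_{f(i),\,y(i),\,t(i)}}

where c_i is the citation count and r_i the Mendeley readership of output i, and each denominator is the average count over all outputs of the same field f, publication year y, and document type t; a value of 1 indicates performance at the world average for the field.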

A Comparative Investigation on Citation Counts and Altmetrics Between Papers Authored by Top Universities and Companies in the Research Field of Artificial Intelligence

Abstract
Artificial Intelligence is currently a popular research field. With the development of deep learning techniques, researchers in this area have achieved impressive results in a variety of tasks. In this initial study, we explored scientific papers in Artificial Intelligence, comparing papers authored by top universities and companies from the dual perspectives of bibliometrics and altmetrics. We selected publication venues according to the venue rankings provided by Google Scholar and Scopus, and retrieved related papers along with their citation counts from Scopus. Altmetrics such as Altmetric Attention Scores and Mendeley reader counts were collected from Altmetric.com and PlumX. Top universities and companies were identified, and the retrieved papers were classified into three groups accordingly: university-authored papers, company-authored papers, and co-authored papers. Comparative results showed that university-authored papers received slightly higher citation counts than company-authored papers, while company-authored papers gained considerably more attention online. In addition, when we focused on the most impactful papers, i.e., the papers with the highest citation counts and those with the largest amounts of online attention, companies seemed to make a larger contribution by publishing more impactful papers than universities.
Feiheng Luo, Han Zheng, Mojisola Erdt, Aravind Sesagiri Raamkumar, Yin-Leng Theng
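A minimal sketch of the group comparison described above, assuming a merged dataset with one row per paper and assumed column names (not taken from the paper):

    # Hedged sketch: compare citations and online attention across affiliation groups.
    import pandas as pd

    df = pd.read_csv("ai_papers.csv")  # assumed columns: group, scopus_citations, altmetric_score
    summary = (df.groupby("group")[["scopus_citations", "altmetric_score"]]
                 .agg(["mean", "median"]))
    print(summary)

Medians are worth reporting alongside means here because citation and attention counts are heavily skewed.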

Scientometric Analysis of Research Performance of African Countries in Selected Subjects Within the Field of Science and Technology

Abstract
This paper assesses the performance of African countries in selected fields of Science and Technology (S&T) over the last twenty years. The purpose is to determine the readiness of these countries in aligning to the strategic direction set by the African Union Agenda 2063 (AU 2063). The AU 2063 aims to bring about a paradigm shift from the current structure, in which its members depend on natural resources to drive their economies, to one that is knowledge-based. It sets out pillars for achieving this feat, which include building and/or upgrading research infrastructures; enhancing professional and technical competencies; promoting entrepreneurship and innovation; and providing an enabling environment for STI development on the African continent. Data used for the study were retrieved from the SCImago database, which comprises a total of seven subject areas cutting across 126 subject categories. Information was also retrieved from the SCImago database on S&T performance with respect to publications in the world and in Africa over the 20-year period 1996–2015. Microsoft Excel was used to analyse the data collected. Results are presented in tables and figures on the top 10 most productive African countries in the field of S&T in all seven selected subject areas. The paper suggests intra-African collaboration between low- and high-performing countries in Africa as an option for developing the knowledge capacities needed for realising its regional developmental agenda (AU 2063).
Yusuff Utieyineshola
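As a hedged sketch of the ranking step (the paper used Microsoft Excel; the export format and column names below are assumptions), the top 10 most productive African countries in a subject area could be derived as follows:

    # Hedged sketch: rank African countries by document count in one subject area,
    # using a hypothetical SCImago country-rank export.
    import pandas as pd

    df = pd.read_csv("scimagojr_country_rank.csv")  # assumed export file
    african = df[df["Region"] == "Africa"]          # assumed column and value
    top10 = african.nlargest(10, "Documents")[["Country", "Documents", "Citations"]]
    print(top10.to_string(index=False))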

Backmatter
