Tech Report

Mining Research Publication Networks for Impact

Evaluating the quality of research publications is a difficult question, and despite decades of research there is still no standard solution. With the amount of scholarly literature expanding rapidly, it is becoming increasingly difficult and time consuming to recognise the key research that presents the most important contributions to science. This question is highly relevant not only to researchers, but also to librarians, publishers, editors, and promotion and grant committees.

Currently, the most widely used methods for evaluating research publications are based mainly on citations. A crucial problem with this approach is the time delay between the date of publication and the arrival of the first citations, which complicates the discovery of recent relevant research. Moreover, the citations included in a publication reflect the choices of its author and do not necessarily indicate the quality of the cited papers. There are also significant differences between citation patterns in different fields of science.

Within this area we are interested in new methods that use semantically richer information for assessing quality. Examples of such information include the semantic similarity of publications, analysis of their full text, and citation network analysis, for instance finding bridges between distinct clusters of publications. The main goals of this research are to evaluate how well citation-based measures represent the quality of a paper and to design new measures of quality that address the challenges in this area.
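As an illustration of the kind of citation network analysis mentioned above, the sketch below finds bridges between clusters of publications using networkx. The graph and paper labels are hypothetical examples, not data or methods from the report; bridges are defined on undirected graphs, so citation direction is ignored.

```python
import networkx as nx

# Hypothetical citation graph: papers a-c form one cluster, d-f another,
# and a single citation (c cites d) links the two clusters.
citations = [
    ("a", "b"), ("b", "c"), ("a", "c"),   # cluster 1
    ("d", "e"), ("e", "f"), ("d", "f"),   # cluster 2
    ("c", "d"),                           # link between the clusters
]

# Build an undirected graph and list its bridges: edges whose removal
# disconnects the graph. Here the c-d citation is the only bridge.
G = nx.Graph(citations)
bridges = list(nx.bridges(G))
print(bridges)
```

A bridge such as the c-d edge would mark a publication connecting otherwise separate research communities, which is one signal a quality measure richer than raw citation counts could exploit.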

This report reviews state-of-the-art methods for evaluating science and research, focusing in particular on research publications and other recorded information related to science. The issues, challenges and gaps in the current research are reviewed, and the research proposal presented in this report is based on this review and gap analysis. The final part of the report presents a pilot study and a detailed plan for the next two years.


First Year Probation Report

ID: kmi-14-01

Date: 2014

Author(s): Drahomira Herrmannova



Knowledge Media Institute
The Open University
Walton Hall
Milton Keynes
United Kingdom
