Guide2Research adheres to high standards and transparent procedures, based on well-established metrics, to produce a number of rankings for the research community in computer science and electronics.

Guide2Research draws on multiple data sources that are cross-correlated and then rigorously inspected and verified to ensure the validity and reliability of our rankings. The main objective is to promote high-quality research venues and to offer young scholars a place to be inspired by leading scientists.

Top Computer Scientists Ranking Methodology

For the ranking of top scientists, first launched in 2014, scholars are ranked in descending order by H-index, combined with their total number of citations.

What is the H-index?

The H-index is a measure that reflects the number of influential documents authored by a scientist. It is computed as the largest number h of papers that have each received at least h citations [3]. The H-index and citation data are obtained from Google Scholar, the largest bibliometric database.
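The definition above can be sketched in a few lines of Python; the citation counts below are purely illustrative.

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    cited = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cited, start=1):  # rank = number of papers so far
        if c >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers with these citation counts
print(h_index([10, 8, 5, 4, 3]))  # → 4 (four papers have at least 4 citations)
```

Sorting the citation counts in descending order makes the check simple: the h-index is the last rank at which the paper's citation count still meets or exceeds its rank.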

Unlike other indexing services that provide H-index data, Google Scholar is a free service, well accepted within the research community, that allows scholars to create and maintain their own profiles under ethical guidelines.

DBLP Cross-correlation

To ensure that only genuine profiles are listed, we verify every profile manually and cross-correlate it against other reliable sources, including DBLP, an indexing database for computer science publications. The number of documents indexed on DBLP for a scientist gives an indication of their contribution to research. We also conduct a rigorous search for the awards each scientist has received from leading research institutions and government agencies.

Approval Threshold

The threshold for accepting a scientist is an H-index of 40, provided that most of their publications are in computer science and indexed in DBLP. While numbers alone can never fully quantify the precious contributions of scientists, the threshold of 40 follows the recommendation in J. E. Hirsch's original h-index paper, where an h-index of 40 characterizes outstanding scientists [1].

Top Computer Science Conferences Ranking Methodology

Guide2Research indexes major conferences in the area of computer science and electronics. The ranking of top conferences is based primarily on the H5-index indicator provided by Google Scholar [4], in tandem with other valuable indicators, including the indexing of proceedings, sponsoring bodies, number of editions, and the profiles of their steering committees.

What is the H5-index?

The H5-index h of a conference series is computed over the articles published in the conference's editions during the last five years: it is the largest number h of such articles that have each received at least h citations.

Sponsorship and indexing requirements

Regarding technical sponsorship and the indexing of conference proceedings, the majority of top conferences indexed by Guide2Research are sponsored or indexed by leading, well-respected publishers and academic organizations, including IEEE, ACL, Springer, AAAI, USENIX, Elsevier, ACM, and LIPIcs.

Scope of topics

Based on the topics and scope of research areas accepted by each conference, rankings of top conferences are provided for major categories in computer science and electronics, including, for instance, machine learning, software engineering, computer vision, and signal processing.

Computer Science University Ranking Methodology

To establish a transparent framework for ranking universities based on objective, well-established metrics across disciplines, Guide2Research is the only ranking platform that treats people as the most valuable asset of research institutions and publishes its data and procedures openly and transparently.

Scholar reputation metric

The ranking provided by Guide2Research for research institutions is based purely on the reputation of their scholars. We firmly believe that institutions should be valued by the talent and reputation of their staff.

Because the ranking problem is compounded by subjectivity, with different experts defining and perceiving the quality and impact of educational institutions differently, major companies providing mainstream rankings do not fully elaborate their ranking procedures, nor do they offer their raw data. Moreover, existing university rankings rely mostly on declarative and subjective analyses of data.

The first edition of the university ranking was released in 2020, covering over 591 research institutions and limited to computer science. The ranking is based on simple metrics closely tied to the reputation of staff, in addition to research outputs, as elaborated below.

How the university reputation metric is calculated

  • The sum of H-indices of all top and leading scholars within a given institution.
  • The number of top scientists with an h-index of 40 or higher.
  • The number of documents published by leading scientists and indexed on DBLP.
  • The sum of citations of all top scholars within a particular research institution.
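The four aggregates above can be sketched as follows; the scholar records and field names are hypothetical, not Guide2Research's actual data schema.

```python
# Hypothetical scholar records for one institution (illustrative values only)
scholars = [
    {"h_index": 55, "citations": 21000, "dblp_docs": 310},
    {"h_index": 42, "citations": 9800,  "dblp_docs": 180},
    {"h_index": 38, "citations": 7600,  "dblp_docs": 140},
]

THRESHOLD = 40  # h-index cut-off used for the "top scientists" count

metrics = {
    "sum_h_index":        sum(s["h_index"] for s in scholars),
    "num_top_scientists": sum(1 for s in scholars if s["h_index"] >= THRESHOLD),
    "sum_dblp_docs":      sum(s["dblp_docs"] for s in scholars),
    "sum_citations":      sum(s["citations"] for s in scholars),
}
print(metrics)
# → {'sum_h_index': 135, 'num_top_scientists': 2,
#    'sum_dblp_docs': 630, 'sum_citations': 38400}
```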

Correlation with other rankings

Based on a cross-matching analysis, the first edition of the Guide2Research university ranking correlates consistently, for the top universities, with the mainstream rankings maintained by leading companies with decades of experience in the field, including QS, U.S. News, and Times Higher Education (THE).

Computer Science Journals & Special Issues Ranking Methodology

Guide2Research offers a list of top computer science journals that are selectively reviewed annually based on a number of indicators related to the quality of accepted papers and the reputation of the journal.

Since no single metric or analytical tool can produce a consensus score that satisfies experts across all disciplines, Guide2Research adheres to a strict policy for indexing journals based on the following metrics.

Journal and special issue metrics used

  • G2R Score: a novel bibliometric indicator that quantifies the level of endorsement of a journal by top and well-respected scientists. It is estimated from the following two factors, computed over data published during the last three years:
    – H-IndexValue: the h-index estimated from publications made solely by top scientists.
    – NumberTopScientists: the number of top scientists who have published in the journal.

  • Impact Factor: a measure that reflects the average number of citations an article in a particular journal receives per year [2]. It is computed as the sum of citations received in a given year by all journal documents published during the two preceding years, divided by the total number of articles published during that period, as elaborated in the following equation:

    IF(y) = [citations received in year y by articles published in years y−1 and y−2] / [number of articles published in years y−1 and y−2]

    The Impact Factor is computed by Web of Science and updated annually [5].
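As a worked example of the two-year definition above, with illustrative numbers:

```python
def impact_factor(citations_to_prev_two_years, articles_prev_two_years):
    """Two-year impact factor: citations received this year by articles
    from the two preceding years, divided by the article count of those years."""
    return citations_to_prev_two_years / articles_prev_two_years

# Illustrative: 600 citations in 2024 to the 200 articles the journal
# published across 2022-2023
print(impact_factor(600, 200))  # → 3.0
```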
  • Scopus CiteScore: a bibliometric indicator proposed by Elsevier to measure the impact of scientific journals based on citation analysis. CiteScore is computed as the number of citations recorded in the Scopus database over a four-year window, divided by the number of documents published in that same window, as shown below:

    CiteScore(y) = [citations received in years y−3 to y by documents published in years y−3 to y] / [number of documents published in years y−3 to y]

  • SJR: is a metric developed by SCIMAGO research laboratory using Scopus data [6]. The measure indicates the scientific influence or prestige of an academic journal based on two factors which are the number of citations in tandem with the source of where the citations come from.
  • Article Processing Charges (APC): journals from publishing houses that require authors to pay an APC to publish their research are reviewed on a case-by-case basis to ensure that they are not predatory publishers or vanity presses whose main goal is profit at the expense of quality and research contribution [7].
  • Editorial board: the editor-in-chief and the members of the journal's editorial board are also checked against different bibliometric sources to ensure that the journal is led by experts in its area of research.

References:

[1] https://www.pnas.org/content/102/46/16569.full

[2] https://en.wikipedia.org/wiki/Impact_factor

[3] https://en.wikipedia.org/wiki/H-index

[4] https://scholar.google.com/intl/en/scholar/metrics.html

[5] https://en.wikipedia.org/wiki/Web_of_Science 

[6] https://www.scimagojr.com/aboutus.php 

[7] https://en.wikipedia.org/wiki/Predatory_publishing