Thursday, April 25, 2024

Emerging Europe & Central Asia (EECA) Top 10,000 Scientists, AD Scientific Index

Why is the “AD Scientific Index” needed?
The “AD Scientific Index” is the first and only study to show the total and last-five-year productivity coefficients of
scientists based on their h-index, i10 index, and citation counts in Google Scholar. The index also ranks and assesses
scientists by academic subject and branch, across 13,600 universities, 206 countries, the regions, and the world. In
other words, the “AD Scientific Index” provides both rankings and analysis.
“AD Scientific Index” (Alper-Doger Scientific Index):
This new index was developed by Prof. Dr. Murat ALPER (MD) and Associate Prof. Dr. Cihan DÖĞER (MD) using the
total and last-five-year values of the i10 index, h-index, and citation counts in Google Scholar, together with the ratio
of each last-five-year value to the corresponding total value. Using a total of nine parameters, the “AD Scientific Index”
ranks an individual scientist across 12 subjects (Agriculture & Forestry; Arts, Design and Architecture; Business &
Management; Economics & Econometrics; Education; Engineering & Technology; History, Philosophy, Theology; Law /
Law and Legal Studies; Medical and Health Sciences; Natural Sciences; Social Sciences; and Others), 256 branches,
13,600 institutions of employment, 206 countries, 11 regions (Africa, Asia, Europe, North America, South America,
Oceania, Arab League, EECA, BRICS, Latin America, and COMESA), and the world. Scientists can thus obtain their
academic rankings and monitor how those rankings develop over time.
Data Collection and Standardization:
Data are collected manually, following the ranking in Google Scholar; profiles with up to 300 citations and verified
addresses, or profiles that otherwise inspire confidence in their accuracy, are listed first. The aim is to standardize
names, institutions, and branches as far as possible. Non-standardized data, with wide variation in the information
provided, the use of abbreviations, and a variety of languages, has caused difficulties. After data mining and scrutiny
of the acquired information, many profiles were excluded from the index, and further profiles continue to be excluded
during the regular examination of the data. Data cleaning is an ongoing process that must be conducted meticulously,
and we welcome your contributions to data cleaning and to ensuring accuracy.
Determining which subject or department a scientific field belongs to may seem straightforward in some branches and
countries, but it creates considerable confusion in others. Fields such as Engineering, Natural and Environmental
Sciences, Biology and Biochemistry, Materials Science, Chemistry, and the Social Sciences span quite different
spectrums in different countries, so standardizing subjects and branches has not been easy. For standardization, we
accepted the official names of institutions and academic branches as specified on each university's website. We
adopted this strategy to standardize a complex situation at least partially.
Studies that influence the ranking through very high citation counts, such as CERN papers:
We have begun appending an asterisk (“*”) to an author's name when the paper concerned has a very large number
of authors, as with CERN, ATLAS, ALICE, and CMS papers, or with statistical datasets, guidelines, updates, and similar
publications. We expect that new criteria will be defined for such studies; until further criteria are described, such
studies are marked with a “*” sign.
Profile information and ethical responsibility:
Ethical responsibility for correct profile information rests entirely with the scientist concerned. However, we believe it
would be prudent for institutions, countries, and even branch associations to review the profiles of their affiliated
scientists periodically, since misleading information may compromise the reputation of the organization or the country.
Organizations should also review profiles to identify and report scientists who are not in fact affiliated with them. To
avoid any compromise to institutional reputation, institutions should take the necessary corrective and preventive
actions against unethically arranged scientist profiles.
Data Cleaning and the Redlist
Data cleaning is a dynamic process that we perform systematically and continuously. Despite all our best efforts, we
may not be completely accurate, and we welcome your redlist notifications. Occasionally a scientist is added to the
redlist because of an innocent mistake made in good faith rather than unethical behavior; most errors result from
inadequate periodic profile checks. Such errors are easy to correct by submitting a correction request. To avoid this
undesirable situation, scientists should check their profiles regularly, and institutions should review their staff's
profiles systematically.
Ranking Criteria:
Scientists are ranked by university, country, region, and worldwide, and by branch and subbranch, based on the
“total h-index”. For each ranking, ties are broken by the remaining criteria in the following order:
- Total h-index ranking: 1. total h-index, 2. total number of citations, 3. total i10 index, 4. last 5 years' h-index.
- Last 5 years' h-index ranking: 1. last 5 years' h-index, 2. citations in the last 5 years, 3. last 5 years' i10 index, 4. total h-index.
- Total i10 index ranking: 1. total i10 index, 2. total h-index, 3. total number of citations, 4. last 5 years' i10 index.
- Last 5 years' i10 index ranking: 1. last 5 years' i10 index, 2. last 5 years' h-index, 3. citations in the last 5 years, 4. total i10 index.
- Total citations ranking: 1. total number of citations, 2. total h-index, 3. total i10 index, 4. citations in the last 5 years.
- Last 5 years' citations ranking: 1. citations in the last 5 years, 2. last 5 years' h-index, 3. last 5 years' i10 index, 4. total number of citations.
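The tie-break orders above amount to a lexicographic sort on a tuple of metrics. The sketch below illustrates the total h-index ranking in Python; the field names and example values are invented for illustration, as the AD Scientific Index does not publish a data schema.

```python
# Hypothetical illustration of the total h-index tie-break order:
# 1. total h-index, 2. total citations, 3. total i10 index, 4. last 5 years' h-index.
scientists = [
    {"name": "A", "h_total": 45, "cit_total": 9000, "i10_total": 120, "h_5y": 30},
    {"name": "B", "h_total": 45, "cit_total": 9500, "i10_total": 110, "h_5y": 28},
    {"name": "C", "h_total": 50, "cit_total": 8000, "i10_total": 140, "h_5y": 35},
]

def total_h_rank_key(s):
    # Python compares tuples element by element, which gives exactly
    # the lexicographic tie-breaking described in the text.
    return (s["h_total"], s["cit_total"], s["i10_total"], s["h_5y"])

ranked = sorted(scientists, key=total_h_rank_key, reverse=True)
print([s["name"] for s in ranked])  # C first; B beats A on citations
```

The other five rankings follow the same pattern with the tuple fields reordered.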
Why are the last 5 years' / total ratios important?
The ratios of the last-5-year h-index, i10 index, and citation values to their respective totals are major unique
characteristics of the AD Scientific Index. They show both the trajectory of an individual scientist's performance and
how the institutional policies of universities are reflected in the overall scientific picture.
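Each ratio is simply the last-5-year value divided by the corresponding total. A minimal illustration, using made-up numbers:

```python
# Made-up example values for one scientist.
h_total, h_5y = 40, 18
citations_total, citations_5y = 7000, 2500

# Share of the metric accumulated in the last five years:
# a value near 1 indicates recent productivity, near 0 older work.
h_ratio = h_5y / h_total
citation_ratio = citations_5y / citations_total

print(round(h_ratio, 2))         # 0.45
print(round(citation_ratio, 2))  # 0.36
```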
Productivity Rankings
Productivity Rankings is a unique service offered only by the “AD Scientific Index”. It is a ranking system derived from
the i10 index that shows a scientist's productivity in publishing scientific articles of value. It lists productive scientists
in a given area, discipline, university, and country, and can guide the development of meaningful incentives and
academic policies. The world, regional, and university rankings of scientists in this table are based on the total i10
index.
Academic collaboration
The scientific fields of interest specified in scientists' profiles are visible to scientists from other countries and
institutions, in order to enable academic collaboration.
Ranking Criteria for Top Universities:
Among the many existing university ranking systems, the “AD Scientific Index” has developed a ranking with a
different methodology, based on the principle of including only meritorious scientists. Using Google Scholar total
h-index scores, we list all academicians ranked in the world's top 10,000 and top 100,000 in the university rankings,
with a breakdown by main subject. Universities are ordered first by their number of scientists in the overall top 10,000
list, second and third by their numbers in the top 100,000 and top 200,000 lists, and fourth by their total number of
scientists in the AD Scientific Index. Where universities remain tied, the highest world ranking achieved by an
individual scientist at each university decides the order.
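The university ordering above is again a lexicographic tie-break. A sketch under the stated criteria, with invented field names and counts:

```python
# Hypothetical illustration of the university tie-break order:
# 1. top-10,000 count, 2. top-100,000 count, 3. top-200,000 count,
# 4. total scientists in the index; remaining ties broken by the
# best (numerically lowest) world rank of an individual scientist.
universities = [
    {"name": "U1", "top10k": 5, "top100k": 40, "top200k": 90, "total": 300, "best_world_rank": 512},
    {"name": "U2", "top10k": 5, "top100k": 40, "top200k": 90, "total": 300, "best_world_rank": 120},
    {"name": "U3", "top10k": 8, "top100k": 35, "top200k": 80, "total": 250, "best_world_rank": 700},
]

def university_key(u):
    # Counts are negated so larger counts sort first, while the final
    # field keeps its sign so a lower (better) world rank sorts first.
    return (-u["top10k"], -u["top100k"], -u["top200k"], -u["total"], u["best_world_rank"])

ranked = sorted(universities, key=university_key)
print([u["name"] for u in ranked])  # U3 leads; U2 beats U1 on best rank
```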
You may sort the ranking from the highest score to the lowest, or vice versa, on any of these fields, and observe which
fields move a given university to the forefront. The name and world ranking of the academician with the highest total
h-index at each university are also displayed. The Top University Ranking by the “AD Scientific Index” not only shows
the areas where a university excels or has room for improvement, but also reflects the outcomes of institutional
policies toward scientists: the competency of institutions to attract prized scientists, and their ability to encourage
advances and retain them.
